Realtime notifications – Pusher

Modern webapps increasingly rely on realtime notifications: web chat (for example Facebook Chat), notifications of new items (Twitter), progress status updates (for example while encoding a file).

The first solution was AJAX polling, which doesn't scale easily (a bad implementation can effectively DDoS your own app) and introduces a delay between updates.

On the server side, a lot of libraries and servers are appearing to solve this. The great news is that WebSockets will be the standard solution for realtime notifications between browser and server. The bad news is that the WebSocket specification is still changing a lot and is not yet stable across major browsers (Firefox, Safari and Chrome implement it, but at different versions of the protocol).

A good approach has been cloud implementations, which remove all the problems of programming and maintaining server resources for this task.

So far I've tested two implementations: Beaconpush (which I liked for its fast implementation and low message-sending cost, but it's now planned to shut down in the following months and only offer self-hosted setups) and Pusher.

Pusher.com is a really great solution for realtime notifications. The team is very responsive and willing to help at all times.

I will detail some advantages, and some of the main problems you could face, if you need to make a choice.

Advantages:

  • Cheap, especially if you are going to send a lot of messages
  • Easy to implement on both the server and the client side
  • Uses WebSockets where available (it picked up the recent Mozilla implementation quickly, for example, and I never had a connection problem across different versions of Safari/Chrome)
  • Has auth mechanisms that let you secure your messages (in Beaconpush it was easy to eavesdrop if you didn't hide the user id)
  • Has presence channels, which means you can know which other users are in a channel without handling it on the server side
  • You can attach data when a user is authorized (for example pre-rendered HTML) which you can then use to render/store on the client side
  • Webhooks let you know when a channel becomes empty or occupied (login/logout if you use a “user channel”)

Disadvantages:

  • I'm not sure if they really enforce the limits on concurrent user connections; if you have a large site with few notifications you could easily hit those limits (they look really low to me)
  • Everything is channels; you don't get a user id to work with. The workaround is easy if you create “user channels” (private channels scoped to a single user id, with authorization handled by your server; see the sketch after this list)
  • They don't have an API for batch-sending a message to multiple channels
  • From the server side you cannot get the list of users logged into a channel (it would be hard to use anyway, since there is no user id param; you would need to match against the socket_id they provide)
  • Webhooks are great for single-page AJAX apps, but won't work “out of the box” if your user browses across several pages, as you will get a lot of spurious login/logout notifications (they have a client-side solution, but on the server side you need to implement your own)
  • In my dev setup, with the default settings I got a connection delay of up to 10 seconds; forcing secure sockets made it work fast and reliably:
    new Pusher(applicationKey, {encrypted: true});
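As a sketch of the “user channel” workaround mentioned above: assuming the community Pusher PHP library (a Pusher class created with the app key, secret and app id, exposing trigger() and socket_auth(); treat these names as assumptions), the server can push to a per-user private channel and authorize subscriptions to it:

<?php
//Sketch only: assumes the community Pusher PHP library
require_once 'Pusher.php';

$pusher = new Pusher($app_key, $app_secret, $app_id);

//push a notification to a single user's private channel
$pusher->trigger('private-user-' . $user_id, 'new-notification', array(
	'message' => 'Your file finished encoding',
));

//auth endpoint: the Pusher JS client POSTs channel_name and socket_id
//when subscribing to a private-* channel; only sign the user's own channel
if ($_POST['channel_name'] === 'private-user-' . $current_user_id) {
	print $pusher->socket_auth($_POST['channel_name'], $_POST['socket_id']);
}

The client then subscribes to its own private-user-* channel and binds to the event; since only your auth endpoint can sign the subscription, other users cannot eavesdrop on the channel.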

I really think Pusher is a great solution for small to medium sites; it allows faster development, and realtime notifications have a very good impact on your application.

Upload S3 files directly with AJAX

One great feature of S3 is that you can upload directly to your S3 bucket through a POST action. The flow is very simple: you only need to generate some params describing where the file is going to be uploaded and its permissions/meta. With the Amazon PHP SDK you can easily do it like:

require_once 'sdk.class.php';

$s3_uploader = new S3BrowserUpload();
$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
	'acl' => AmazonS3::ACL_PUBLIC,
	'key' => $path_file,
	'Content-Type' => $mime,
));

The returned $params will be an array whose $params['inputs'] entries need to be passed as hidden form values; finally you need a file input, placed after all the other params, where the user will select the file:

<input type="file" name="file" />

You can also specify a redirect URL to send the user back to your page after the upload.
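For reference, here is a minimal sketch of rendering that form from $params; it assumes the 'form'/'inputs' array structure the SDK returns (the same one used by the JSON endpoint later in this post, where the action value comes without a scheme):

<form action="http://<?php echo $params['form']['action']; ?>" method="post" enctype="multipart/form-data">
<?php foreach ($params['inputs'] as $name => $value): ?>
	<input type="hidden" name="<?php echo htmlspecialchars($name); ?>" value="<?php echo htmlspecialchars($value); ?>" />
<?php endforeach; ?>
	<!-- the file input goes after all the other params -->
	<input type="file" name="file" />
	<input type="submit" value="Upload" />
</form>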

This works fine for “traditional uploads”, but what if you want to take advantage of new APIs, like storing string content into a file, grabbing the current canvas image and saving it directly to S3 without sending it to a server, or making a thumbnail on the fly and uploading it directly to S3?

The first problem here is that you cannot make cross-domain calls, since S3 doesn't support adding an Access-Control-Allow-Origin header that would allow you to POST directly. Fortunately, with postMessage you can upload an HTML page and a JavaScript file to your S3 bucket and have them act as a “proxy”.

The other problem is that postMessage only allows types that can be cloned; for security reasons a File/FileReader cannot be cloned, so you need to pass the contents directly instead of the object.

In this example I will show how to:

  1. Select an image
  2. Resize it to half size through a canvas
  3. Get the base64-encoded binary
  4. Pass it through postMessage along with the S3 POST data
  5. Decode the base64
  6. POST it through AJAX to the S3 subdomain

In fact it's very simple and uses only standard features, but unfortunately, since HTML5 is still a work in progress in most browsers, it currently only works on Firefox 4+ and Chrome 11+.

First let's review the part that will be stored in your S3 bucket and serve as the proxy. We will have three files:

  1. index.html
    Will contain only the code to load the JavaScript, nothing more
  2. base64-binary.js
    A handy script that decodes a base64 image into an ArrayBuffer, which can be converted into a Blob and sent as a “file” in a POST request
  3. upload.js
    This script contains the logic: it receives the message and turns the base64-encoded image plus the POST params into an AJAX request

You can see the full files here:
https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload/s3-files

The core part of the upload.js is:

window.addEventListener("message", function(event) {
	var data = event.data; //get the data

	//upload data through a blob
	var separator = 'base64,';
	var index = data.content.indexOf(separator);
	if (index != -1) {
		//note: BlobBuilder may be vendor-prefixed (WebKitBlobBuilder/MozBlobBuilder)
		var bb = new BlobBuilder();

		//decode the base64 binary into an ArrayBuffer
		var barray = Base64Binary.decodeArrayBuffer(data.content.substring(index + separator.length));
		bb.append(barray);

		var blob = bb.getBlob();

		//pass post params through FormData
		var formdata = new FormData();
		for (var param_key in data.params) {
			formdata.append(param_key, data.params[param_key]);
		}
		formdata.append("file", blob, "myblob.png"); //add the blob

		//finally post the file through AJAX
		var xhr = new XMLHttpRequest();
		xhr.open("POST", data.url, true);
		xhr.send(formdata);
	}
}, false);

As you can see, we are only decoding the base64, converting it into a Blob, passing all the params into a FormData and uploading through AJAX.

As this file is served from S3 there won't be any cross-domain issue, but the browser needs to support Blob and FormData for it to work.

The page doing the upload needs a bit more code. First we need to generate the params for S3; this is easily done in PHP with:

<?php
	//init aws-sdk
	require_once 'sdk.class.php';

	//response will be in json
	header('Content-Type: application/json');	

	$bucket = 'danguer-blog'; //your s3 bucket
	$bucket_path = 'upload-html5/tmp'; //"dir" inside the bucket where files will be stored
	$filename = isset($_POST['name'])?$_POST['name']:null;
	$filemime = isset($_POST['mime'])?$_POST['mime']:null;

	//handle errors
	if (empty($filename)) {
		print json_encode(array(
			'error' => 'must provide the filename',
		));
		return;
	}

	if (empty($filemime)) {
		print json_encode(array(
			'error' => 'must provide the mime',
		));
		return;
	}

	if (strpos($filename, '..') !== false) {
		print json_encode(array(
			'error' => 'relative paths are not allowed',
		));
		return;
	}

	$expires = '+15 minutes'; //token will be valid only for 15 minutes
	$path_file = "{$bucket_path}/{$filename}";
	$mime = $filemime; //help the browsers to interpret the content	

	//get the params for s3 to upload directly
	$s3_uploader = new S3BrowserUpload();
	$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
		'acl' => AmazonS3::ACL_PUBLIC,
		'key' => $path_file,
		'Content-Type' => $mime,
	));

	print json_encode(array(
		'error' => '',
		'url' => "http://{$params['form']['action']}",
		'params' => $params['inputs']
	));
	return;

Next, we need to present the file input and create the iframe pointing to the S3 files we have uploaded:

	<!-- hidden frame -->
	<iframe id="postMessageFrame" src="<?=$url_iframe?>">
	</iframe>

	<h3>Upload Files</h3>
	<input type="file" accept="image/*" onchange="uploadFile(this.files)">

And there is a bit more JavaScript code, since once the file is selected we need to resize it:

function resizeImage(file, mime) {
	var canvas = document.createElement("canvas");
	var ctx = canvas.getContext("2d");
	var canvasCopy = document.createElement("canvas");
	var ctxCopy = canvasCopy.getContext("2d");

	var reader = new FileReader();
	reader.onload = function() {
		var image = new Image();
		image.onload = function() {
			//copy at full size, then scale to half
			canvasCopy.width = image.width;
			canvasCopy.height = image.height;
			ctxCopy.drawImage(image, 0, 0);

			canvas.width = image.width * 0.5;
			canvas.height = image.height * 0.5;
			ctx.drawImage(canvasCopy, 0, 0, canvasCopy.width, canvasCopy.height, 0, 0, canvas.width, canvas.height);

			//convert into an image and get the binary base64-encoded
			//to pass through postMessage
			var url = canvas.toDataURL(mime, 0.80);
			uploadImage(file, url);
		};

		image.src = reader.result;
	};

	//read contents
	reader.readAsDataURL(file);
}

As you can see, we are opening the selected file, drawing it into a canvas, copying it into another canvas at half the size, and getting the base64 binary in the same mimetype as the file; finally we call the upload function:

function uploadImage(file, dataURL) {
	//load signature
	var signature_params = {
		mime: file.type,
		name: file.name
	};

	$.post(url_ajax_signature, signature_params, function(response) {
		//send through crossdomain page
		var windowFrame = document.getElementById('postMessageFrame').contentWindow ;
		var data = {
			params: response.params,
			url: response.url,
			content: dataURL
		};

		//send data of s3 request signature and base64 binary data
		windowFrame.postMessage(data, 'http://<?=$url_iframe_host?>');
	}, 'json');
}

First we get the S3 POST params through AJAX from the PHP file, then we send that data through postMessage to the S3 iframe so it can process the upload, without first sending the file to our server and uploading it to S3 from there. With this you can upload directly to S3 from the client's browser, and if needed you can later check the contents to reject unwanted files or malware.

All the code can be seen here:

https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload


Base64 Binary Decoding in Javascript

Currently all the base64 decoders in JavaScript work on strings, which is not suitable for binary data. The common example: if you use a canvas element and get its base64 representation (canvas.toDataURL()), you usually upload it and do the base64 decode on the server; if you try to process the data in JavaScript instead, you will find the data gets corrupted, since it's processed as a string.
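For reference, the server-side decode mentioned above is straightforward; here is a minimal PHP sketch (the 'image' POST field name is just an assumption for illustration):

<?php
//decode a canvas dataURL ("data:image/png;base64,....") posted by the browser
$dataURL = $_POST['image']; //hypothetical field name

$separator = 'base64,';
$index = strpos($dataURL, $separator);
if ($index !== false) {
	$binary = base64_decode(substr($dataURL, $index + strlen($separator)));
	file_put_contents('upload.png', $binary); //store the raw binary
}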

This script uses the new JavaScript typed arrays. The basic usage is decoding the base64 into a Uint8Array, which you can call like:

var uintArray = Base64Binary.decode(base64_string);

The other method wraps the Uint8Array into an ArrayBuffer; this is very helpful, for example, to upload a file through FormData (converting the buffer into a Blob and calling formdata.append("file", blob, "test.png")). You can use it with:

var byteArray = Base64Binary.decodeArrayBuffer(base64_string);

You can check the code here:
https://github.com/danguer/blog-examples/blob/master/js/base64-binary.js


Login through Google+ and OAuth API

One of the common scenarios is logging in through an external account like Google+, Twitter or Facebook.
In this code snippet I will show how to log in using the Google+ API.

First you need to register your API at https://code.google.com/apis/console/, enable the Google+ API service and create a new application; in the “Redirect URI” you need to put the path of the script and append the following: “googleplus.php?op=redirect”. I will explain why later, after you see the code; you can then split the code into other, more elegant paths/scripts.

From the generated application you will need the Redirect URI and the Client ID.

Google OAuth works in the following way:

  1. Generate a URL using params for the services you want to access (scope), your Client ID and your Redirect URI (a sketch of this step follows the list)
  2. Redirect the user to this URL
  3. After the user has approved access, they are redirected back with the params in the URL hash, so your server won't see them; you can pass them to your server through AJAX or, in this case, through a GET request
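A minimal sketch of steps 1 and 2 in PHP; the endpoint and param names follow the standard OAuth2 implicit flow, and the plus.me scope and example.com host are my assumptions:

<?php
//build the OAuth2 authorization URL (implicit flow: token comes back in the hash)
$params = array(
	'response_type' => 'token',
	'client_id' => $client_id,
	'redirect_uri' => 'http://example.com/googleplus.php?op=redirect',
	'scope' => 'https://www.googleapis.com/auth/plus.me',
);
$auth_url = 'https://accounts.google.com/o/oauth2/auth?' . http_build_query($params);

//step 2: send the user to Google's approval page
header("Location: {$auth_url}");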

In the code, “?op=redirect” shows just a blank page with a JavaScript redirect that converts the hash params into a normal GET request; this lets the server grab the access_token, which you can then check against the Google+ site to see if it is valid.
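A minimal sketch of that redirect page (the op=check name is hypothetical; the verification code below simply reads access_token directly):

<?php if ($_GET['op'] == 'redirect'): ?>
<html>
<body>
<script type="text/javascript">
//turn "#access_token=..." into "?op=check&access_token=..."
//so the params reach the server as a normal GET request
window.location.replace('googleplus.php?op=check&' + window.location.hash.substring(1));
</script>
</body>
</html>
<?php endif; ?>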
The verification part looks like this:

<?php
	$access_token = $_GET['access_token'];

	//do something with the token; first check it is real
	$data = @file_get_contents("https://www.googleapis.com/plus/v1/people/me?access_token={$access_token}");
	if ($data) {
		print $data;
	} else {
		print "Token not valid!";
	}
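From there you would typically json_decode the response and use the profile id as the login identity. A short sketch; the id and displayName field names are from the Google+ people resource, to the best of my knowledge:

<?php
$profile = json_decode($data);

//tie the local account to this Google+ profile
$user_id = $profile->id;
$name = $profile->displayName;
//e.g. look up or create a local user keyed by $user_id (your own logic)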

The code is not prepared to handle errors (an error will show up as $_GET['error'] after the redirect), so you need to handle that for your scenario.



Using php for analyzing apache logs

Apache has a nice feature: it can send its log output through a pipe. This avoids configuring syslog, or creating a listening server in PHP to receive forwarded syslog messages.

The changes in Apache are really simple; you just need to write something like:

LogFormat "%v %A %D \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" mylog
CustomLog "|/usr/bin/php5 [PATH_TO_SCRIPT]/apache-stdin.php log" mylog
ErrorLog "|/usr/bin/php5 [PATH_TO_SCRIPT]/apache-stdin.php error"

In apache-stdin.php the flow is very simple; you just need to do:

<?php
$fp = fopen('php://stdin', 'r');
do {
	//read a line from apache; fgets blocks until one is available
	$data = fgets($fp);
	$data = trim($data); //remove line end

	if (empty($data)) {
		break; //no more data so finish it
	}

	//process the data
} while(true);

fclose($fp);

As you can see, it basically reads a line and processes it.

I’ve built a helper script around this at:
https://github.com/danguer/blog-examples/blob/master/php/syslog/apache-stdin.php

You can pass an additional param to specify whether it is the normal log or the error log, as in the Apache configuration posted above.

You can also configure the log format you are using in Apache so it gives you a simple description, like:

<?php
$format = '%v %A %D \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"';

With this, it will give you an array like:

array(
	'hostname' => 'danguer.com',
	'local_ip' => '127.0.0.1',
	'time_ms' => 0.0002,
	'first_line_request' => 'GET / HTTP/1.1',
	'status_last' => 200,
	'bytes_sent' => 2048
);
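Under the hood this kind of parsing boils down to a regular expression derived from the format string. Here is a minimal hand-rolled sketch (not the helper's actual code; the referer/user_agent key names are my own):

<?php
//match a "%v %A %D \"%r\" %>s %O \"%{Referer}i\" \"%{User-Agent}i\"" line:
//unquoted fields split on whitespace, quoted fields captured between "..."
$line = 'danguer.com 127.0.0.1 200 "GET / HTTP/1.1" 200 2048 "-" "Mozilla/5.0"';
$pattern = '/^(\S+) (\S+) (\S+) "([^"]*)" (\S+) (\S+) "([^"]*)" "([^"]*)"$/';

if (preg_match($pattern, $line, $m)) {
	$parsed = array(
		'hostname' => $m[1],
		'local_ip' => $m[2],
		'time_ms' => $m[3] / 1000000, //%D is in microseconds
		'first_line_request' => $m[4],
		'status_last' => (int) $m[5],
		'bytes_sent' => (int) $m[6],
		'referer' => $m[7],
		'user_agent' => $m[8],
	);
}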

From there you can store it in a file (the default behavior of this script), insert it into a DB, use SimpleDB, etc.
