
Upload S3 files directly with AJAX

One great feature of S3 is that you can upload directly to your S3 bucket through a POST action. The flow is very simple: you only need to generate some parameters describing where the file is going to be uploaded along with its permissions/metadata. With the Amazon PHP SDK you can easily do it like this:

	require_once 'sdk.class.php';
	$s3_uploader = new S3BrowserUpload();
	$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
		'acl' => AmazonS3::ACL_PUBLIC,
		'key' => $path_file,
		'Content-Type' => $mime,
	));

The $params result is an array; the values in $params['inputs'] need to be passed as hidden form fields, and finally you need a file input where the user will select the file (placed after all the other fields):

<input type="file" name="file" />

You can also specify a redirect URL to send the user back to your page after the upload finishes.
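For illustration, here is a minimal sketch of building that form dynamically with JavaScript; the s3Params object is a hypothetical variable holding the values from $params['form']['action'] and $params['inputs'] printed into the page:

var s3Params = window.s3Params; //hypothetical: the $params values echoed into the page as JSON

var form = document.createElement("form");
form.method = "POST";
form.enctype = "multipart/form-data";
form.action = "http://" + s3Params.action; //the bucket endpoint from $params['form']['action']

//hidden fields: key, acl, policy, signature, AWSAccessKeyId, Content-Type, ...
for (var name in s3Params.inputs) {
	var hidden = document.createElement("input");
	hidden.type = "hidden";
	hidden.name = name;
	hidden.value = s3Params.inputs[name];
	form.appendChild(hidden);
}

//the file input goes last; S3 ignores any field that comes after it
var fileInput = document.createElement("input");
fileInput.type = "file";
fileInput.name = "file";
form.appendChild(fileInput);

document.body.appendChild(form);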

This works fine for “traditional” uploads, but what if you want to take advantage of newer APIs, like storing string content into a file, grabbing the current canvas image and saving it directly to S3 without going through your server, or generating a thumbnail on the fly and uploading it straight to S3?

The first problem is that you cannot make cross-domain calls, since S3 doesn’t offer a way to add an Access-Control-Allow-Origin header that would allow you to POST directly. Fortunately, with postMessage you can upload an HTML file and a JavaScript file to your S3 bucket and use them as a “proxy”.

The other problem is that postMessage only accepts types that can be cloned; for security reasons a File/FileReader cannot be cloned, so you need to pass the contents directly instead of the object.
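To give an idea of the payload, this is roughly the handshake used later in this post: the parent page sends a plain object of strings (the signed POST fields, the target URL, and the file contents as a data URL) to the iframe hosted on S3. The values here are only illustrative:

var frame = document.getElementById("postMessageFrame").contentWindow;
frame.postMessage({
	url: "http://my-bucket.s3.amazonaws.com/", //illustrative bucket endpoint
	params: {}, //the signed POST fields generated server-side
	content: "data:image/png;base64,iVBOR..." //file contents as a data URL, not a File object
}, "http://my-bucket.s3.amazonaws.com"); //target origin: the iframe's S3 domain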

In this example I will show how to:

  1. Select an image
  2. Resize it to half size through canvas
  3. Get the base64-encoded binary
  4. Pass it via postMessage along with the S3 POST data
  5. Decode the base64
  6. Send it through AJAX to your bucket's S3 domain

It is actually very simple and only uses standard features, but unfortunately, since these HTML5 APIs are still a work in progress in most browsers, it currently only works in Firefox 4+ and Chrome 11+.

First let’s review the part that will be stored in your S3 bucket and act as a proxy. We will have three files:

  1. index.html
    Contains only the code to load the JavaScript and nothing more
  2. base64-binary.js
    A handy script that decodes a base64 image into an ArrayBuffer, which can be converted into a Blob and sent as the “file” field of the POST request (a rough equivalent is sketched after the link below)
  3. upload.js
    Contains the logic: it receives the message, decodes the base64-encoded image, and turns it plus the POST params into an AJAX request

You can see the full files here:
https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload/s3-files
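As a rough idea of what base64-binary.js provides, the decoding step can be sketched with the standard atob() like this (this is not the library’s actual code):

function base64ToArrayBuffer(base64) {
	var binary = atob(base64); //base64 -> binary string
	var bytes = new Uint8Array(binary.length);
	for (var i = 0; i < binary.length; i++) {
		bytes[i] = binary.charCodeAt(i); //copy each byte
	}
	return bytes.buffer; //the underlying ArrayBuffer
}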

The core part of the upload.js is:

window.addEventListener("message", function(event) {
	var data = event.data; //get the message data

	//upload data through a blob
	var separator = 'base64,';
	var index = data.content.indexOf(separator);
	if (index != -1) {
		var bb = new BlobBuilder();

		//decode the base64 binary into an ArrayBuffer
		var barray = Base64Binary.decodeArrayBuffer(data.content.substring(index + separator.length));
		bb.append(barray);

		var blob = bb.getBlob();

		//pass the post params through FormData
		var formdata = new FormData();
		for (var param_key in data.params) {
			formdata.append(param_key, data.params[param_key]);
		}
		formdata.append("file", blob, "myblob.png"); //add the blob as the file field

		//finally post the file through AJAX
		var xhr = new XMLHttpRequest();
		xhr.open("POST", data.url, true);
		xhr.send(formdata);
	}
}, false);

As you can see, we are just decoding the base64, converting it into a Blob, putting all the params into a FormData, and uploading it through AJAX.

Since this file is served from S3 there is no cross-domain issue, but the browser needs to support Blob and FormData for it to work.
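A simple capability check before relying on the proxy could look like this; note that BlobBuilder was vendor-prefixed in some browsers at the time, so this is only a sketch:

var BlobBuilderImpl = window.BlobBuilder || window.WebKitBlobBuilder || window.MozBlobBuilder;
if (!BlobBuilderImpl || typeof FormData === "undefined") {
	//fall back to the traditional form-based upload described at the beginning
}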


The page that triggers the upload needs a bit more code. First we need to generate the POST params for S3; this is done easily in PHP:

<?php
	//init aws-sdk
	require_once 'sdk.class.php';

	//response will be in json
	header('Content-Type: application/json');	

	$bucket = 'danguer-blog'; //your s3 bucket
	$bucket_path = 'upload-html5/tmp'; //"dir" inside the bucket where the file will be stored
	$filename = isset($_POST['name'])?$_POST['name']:null;
	$filemime = isset($_POST['mime'])?$_POST['mime']:null;

	//handle errors
	if (empty($filename)) {
		print json_encode(array(
			'error' => 'must provide the filename',
		));
		return;
	}

	if (empty($filemime)) {
		print json_encode(array(
			'error' => 'must provide the mime',
		));
		return;
	}

	if (strpos($filename, '..') !== false) {
		print json_encode(array(
			'error' => 'relative paths are not allowed',
		));
		return;
	}

	$expires = '+15 minutes'; //token will be valid only for 15 minutes
	$path_file = "{$bucket_path}/{$filename}";
	$mime = $filemime; //helps the browser interpret the content

	//get the params for s3 to upload directly
	$s3_uploader = new S3BrowserUpload();
	$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
		'acl' => AmazonS3::ACL_PUBLIC,
		'key' => $path_file,
		'Content-Type' => $mime,
	));

	print json_encode(array(
		'error' => '',
		'url' => "http://{$params['form']['action']}",
		'params' => $params['inputs']
	));
	return;
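
For reference, a successful response from this script looks roughly like the object below; the exact fields depend on the SDK version and your bucket policy, and the values shown are only illustrative:

{
	"error": "",
	"url": "http://danguer-blog.s3.amazonaws.com/",
	"params": {
		"AWSAccessKeyId": "AKIA...",
		"key": "upload-html5/tmp/picture.png",
		"acl": "public-read",
		"Content-Type": "image/png",
		"policy": "eyJleHBpcmF0aW9uIjoi...",
		"signature": "..."
	}
}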


Next, we need to present the file input and create the iframe pointing to the S3 files we have uploaded:

	<!-- hidden frame -->
	<iframe id="postMessageFrame" src="<?=$url_iframe?>">
	</iframe>

	<h3>Upload Files</h3>
	<input type="file" accept="image/*" onchange="uploadFile(this.files)">

There is a bit more JavaScript this time, since once the file is selected we need to resize it:

function resizeImage(file, mime) {
	var canvas = document.createElement("canvas");
	var ctx = canvas.getContext("2d");
	var canvasCopy = document.createElement("canvas");
	var ctxCopy = canvasCopy.getContext("2d");

	var reader = new FileReader();
	reader.onload = function() {
		var image = new Image();
		image.onload = function() {
			//copy the original image at full size
			canvasCopy.width = image.width;
			canvasCopy.height = image.height;
			ctxCopy.drawImage(image, 0, 0);

			//scale it to half the size
			canvas.width = image.width * 0.5;
			canvas.height = image.height * 0.5;
			ctx.drawImage(canvasCopy, 0, 0, canvasCopy.width, canvasCopy.height, 0, 0, canvas.width, canvas.height);

			//convert it into an image and get the base64-encoded binary
			//to pass through postMessage
			var url = canvas.toDataURL(mime, 0.80);
			uploadImage(file, url);
		};

		image.src = reader.result;
	};

	//read contents
	reader.readAsDataURL(file);
}

As you can see, we open the selected file, draw it onto one canvas at full size, copy it onto a second canvas at half the size, and get the result as a base64 data URL in the same mimetype as the file; finally we call the function that uploads it:

function uploadImage(file, dataURL) {
	//load signature
	var signature_params = {
		mime: file.type,
		name: file.name
	};

	$.post(url_ajax_signature, signature_params, function(response) {
		//send through crossdomain page
		var windowFrame = document.getElementById('postMessageFrame').contentWindow;
		var data = {
			params: response.params,
			url: response.url,
			content: dataURL
		};

		//send data of s3 request signature and base64 binary data
		windowFrame.postMessage(data, 'http://<?=$url_iframe_host?>');
	}, 'json');
}

We first get the S3 POST params through AJAX from the PHP file, and then send them together with the image data through a postMessage to the S3 iframe, so it can upload directly to S3 without first sending the file to your server and re-uploading it from there. With this you can upload straight from the client’s browser, and if needed you can check the contents afterwards to reject unwanted files or malware.
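The real content check has to happen server-side once the object is in S3, but a cheap client-side filter is easy to add before requesting the signature; a sketch with an arbitrary size limit:

function canUpload(file) {
	var isImage = /^image\//.test(file.type); //only accept image mimetypes
	var smallEnough = file.size <= 5 * 1024 * 1024; //arbitrary 5 MB limit
	return isImage && smallEnough;
}

You could call it at the start of uploadFile() before resizing, and skip the signature request when it returns false.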

All the code can be seen here:

https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload
