Upload S3 files directly with AJAX

One great feature of S3 is that you can upload files directly to your S3 bucket through a POST action. The flow is very simple: you only need to generate some parameters describing where the file is going to be uploaded, plus its permissions/metadata. With the Amazon PHP SDK you can easily do it like this:

	require_once 'sdk.class.php';

	$s3_uploader = new S3BrowserUpload();
	$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
		'acl' => AmazonS3::ACL_PUBLIC,
		'key' => $path_file,
		'Content-Type' => $mime,
	));

$params will be an array; the values in $params['inputs'] need to be passed as hidden form fields, and finally you need to add a file input (after all the other params) where the user will select the file:

<input type="file" name="file" />

You can also specify a redirect URL to get back to your page after the user has uploaded.

This works fine for “traditional uploads”, but what if you want to take advantage of newer APIs, like storing string content into a file, grabbing the current canvas image and saving it directly to S3 without sending it to a server, or making a thumbnail on the fly and uploading it directly to S3?

The first problem here is that you cannot make cross-domain calls, since S3 does not support adding an Access-Control-Allow-Origin header that would allow you to POST directly. Fortunately, with postMessage you can upload an HTML page and a JavaScript file to your S3 bucket and use them as a “proxy”.

The other problem is that postMessage only allows types that can be cloned; for security reasons a File/FileReader cannot be cloned, so you need to pass the contents directly instead of the object.
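To make this concrete, here is a minimal sketch of building such a payload. The field names (params, url, content) match the data object used later in this post; the bucket URL and key are hypothetical placeholders:

```javascript
// Sketch: build the postMessage payload. Only cloneable data goes in:
// plain objects, strings, numbers -- the file's contents as a data-URL
// string, never the File object itself.
function buildMessage(params, url, dataURL) {
	return {
		params: params,   // S3 POST fields (policy, signature, ...)
		url: url,         // S3 endpoint to POST to
		content: dataURL  // file contents as a base64 data URL
	};
}

// a plain, JSON-serializable object survives structured cloning intact
var msg = buildMessage({ key: 'tmp/pic.png' },
	'http://example-bucket.s3.amazonaws.com/',
	'data:image/png;base64,AAAA');
```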

In this example we are going to:

  1. Select an image
  2. Resize it to half size through canvas
  3. Get the base64-encoded binary
  4. Pass it to postMessage along with the S3 POST data
  5. Decode the base64
  6. Send it through AJAX to the S3 subdomain

In fact it is very simple and uses only standard features, but unfortunately, as HTML5 support is still a work in progress in most browsers, it currently only works on Firefox 4+ and Chrome 11+.

First let’s review the part that will be stored in your S3 bucket and serve as a proxy; we will have three files:

  1. index.html
    Will contain only the code to load the JavaScript and nothing more
  2. base64-binary.js
    A handy script which decodes a base64-encoded image into an ArrayBuffer, which can be converted into a Blob and sent as a “file” in a POST request
  3. upload.js
    This script contains the logic: it receives the message, decodes the base64-encoded image, and turns it and the POST params into an AJAX request
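If you prefer not to ship a helper library, the decoding idea behind base64-binary.js can be sketched with the browser’s built-in atob(). This is an alternative sketch of mine, not the code from that file:

```javascript
// Sketch: decode a base64 string into a Uint8Array of raw bytes.
function base64ToUint8Array(base64) {
	var binary = atob(base64); // base64 -> binary string
	var bytes = new Uint8Array(binary.length);
	for (var i = 0; i < binary.length; i++) {
		bytes[i] = binary.charCodeAt(i); // copy each byte value
	}
	return bytes;
}
```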

You can see the full files here:
https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload/s3-files

The core part of the upload.js is:

window.addEventListener("message", function(event) {
	var data = event.data; //get the data

	//upload data through a blob
	var separator = 'base64,';
	var index = data.content.indexOf(separator);
	if (index != -1) {
		var bb = new BlobBuilder();

		//decode the base64 binary into an ArrayBuffer
		var barray = Base64Binary.decodeArrayBuffer(data.content.substring(index + separator.length));
		bb.append(barray);

		var blob = bb.getBlob();

		//pass post params through FormData
		var formdata = new FormData();
		for (var param_key in data.params) {
			formdata.append(param_key, data.params[param_key]);
		}
		formdata.append("file", blob, "myblob.png"); //add the blob

		//finally post the file through AJAX
		var xhr = new XMLHttpRequest();
		xhr.open("POST", data.url, true);
		xhr.send(formdata);
	}
}, false);

As you can see, we are only decoding the base64 and converting it into a Blob, passing all the params into a FormData and uploading it through AJAX.
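One caveat worth handling: because the POST is cross-origin and S3 (without CORS headers) returns an opaque response, the XHR may report status 0 even when the object was stored. A defensive completion handler might look like this (a sketch of mine, not part of the original upload.js):

```javascript
// Sketch: report when the upload request finishes. With an opaque
// cross-origin response, status 0 does not necessarily mean failure.
function onUploadDone(xhr, callback) {
	xhr.onreadystatechange = function() {
		if (xhr.readyState === 4) { // request finished
			var maybeOk = xhr.status === 0 ||
				(xhr.status >= 200 && xhr.status < 300);
			callback(maybeOk);
		}
	};
}
```

To confirm whether the upload really succeeded you would still have to check the object in the bucket (or from your server side).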

As this file will be served from S3 there won’t be any cross-domain issue, but the browser needs to be able to create a Blob and a FormData for it to work.
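A quick feature check before attempting the upload can save a confusing silent failure. This sketch tests for the standard and vendor-prefixed constructors of that era:

```javascript
// Sketch: detect whether the browser can build the pieces the proxy
// needs (FormData plus some way to create a Blob).
function supportsDirectUpload() {
	var hasBlobBuilder = typeof BlobBuilder !== 'undefined' ||
		typeof WebKitBlobBuilder !== 'undefined' ||
		typeof MozBlobBuilder !== 'undefined';
	return typeof FormData !== 'undefined' &&
		(typeof Blob !== 'undefined' || hasBlobBuilder);
}
```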

 

The uploading page needs a bit more code. First, we need to generate the S3 params; this is done easily in PHP with:

<?php
	//init aws-sdk
	require_once 'sdk.class.php';

	//response will be in json
	header('Content-Type: application/json');

	$bucket = 'danguer-blog'; //your s3 bucket
	$bucket_path = 'upload-html5/tmp'; //"dir" inside the bucket where the file will be stored
	$filename = isset($_POST['name']) ? $_POST['name'] : null;
	$filemime = isset($_POST['mime']) ? $_POST['mime'] : null;

	//handle errors
	if (empty($filename)) {
		print json_encode(array(
			'error' => 'must provide the filename',
		));
		return;
	}

	if (empty($filemime)) {
		print json_encode(array(
			'error' => 'must provide the mime type',
		));
		return;
	}

	if (strpos($filename, '..') !== false) {
		print json_encode(array(
			'error' => 'relative paths are not allowed',
		));
		return;
	}

	$expires = '+15 minutes'; //token will be valid only for 15 minutes
	$path_file = "{$bucket_path}/{$filename}";
	$mime = $filemime; //help the browser interpret the content

	//get the params to upload directly to s3
	$s3_uploader = new S3BrowserUpload();
	$params = $s3_uploader->generate_upload_parameters($bucket, $expires, array(
		'acl' => AmazonS3::ACL_PUBLIC,
		'key' => $path_file,
		'Content-Type' => $mime,
	));

	print json_encode(array(
		'error' => '',
		'url' => "http://{$params['form']['action']}",
		'params' => $params['inputs'],
	));
	return;
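For reference, the JSON this script returns looks roughly like the object below. All values are illustrative placeholders (not real credentials), and the exact set of fields depends on what the SDK generates:

```javascript
// Sketch of the signature endpoint's response consumed by the client.
var exampleResponse = {
	error: '',
	url: 'http://danguer-blog.s3.amazonaws.com/',
	params: {
		key: 'upload-html5/tmp/photo.png',  // where the object will land
		acl: 'public-read',                 // assumed ACL value
		'Content-Type': 'image/png',
		AWSAccessKeyId: 'AKIAEXAMPLE',      // placeholder, not real
		policy: '<base64-encoded policy>',  // placeholder
		signature: '<hmac-sha1 signature>'  // placeholder
	}
};
```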

 

Next, we need to present the file input and create the iframe pointing to the S3 files we have uploaded:

	<!-- hidden frame -->
	<iframe id="postMessageFrame" src="<?=$url_iframe?>">
	</iframe>

	<h3>Upload Files</h3>
	<input type="file" accept="image/*" onchange="resizeImage(this.files[0], this.files[0].type)">

The JavaScript code is a bit longer, since once the file is selected we need to resize it:

function resizeImage(file, mime) {
	var canvas = document.createElement("canvas");
	var ctx = canvas.getContext("2d");
	var canvasCopy = document.createElement("canvas");
	var ctxCopy = canvasCopy.getContext("2d");

	var reader = new FileReader();
	reader.onload = function() {
		var image = new Image();
		image.onload = function() {
			//scale to half size
			canvasCopy.width = image.width;
			canvasCopy.height = image.height;
			ctxCopy.drawImage(image, 0, 0);

			canvas.width = image.width * 0.5;
			canvas.height = image.height * 0.5;
			ctx.drawImage(canvasCopy, 0, 0, canvasCopy.width, canvasCopy.height, 0, 0, canvas.width, canvas.height);

			//convert into an image and get it as base64-encoded binary
			//to pass through postMessage
			var url = canvas.toDataURL(mime, 0.80);
			uploadImage(file, url);
		};

		image.src = reader.result;
	};

	//read contents
	reader.readAsDataURL(file);
}
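The scaling arithmetic inside resizeImage can be pulled out into a pure helper (a sketch of mine, not code from the post), which makes the factor easy to change and test:

```javascript
// Sketch: compute the target canvas size for a given scale factor,
// rounding to whole pixels.
function scaledSize(width, height, factor) {
	return {
		width: Math.round(width * factor),
		height: Math.round(height * factor)
	};
}
```

resizeImage above uses a fixed factor of 0.5, i.e. scaledSize(image.width, image.height, 0.5).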

As you can see, we open the selected file, draw it into a canvas, copy it into another canvas at half the size, and get a base64 binary in the same mime type as the file; finally we call the function to upload it:

function uploadImage(file, dataURL) {
	//load signature
	var signature_params = {
		mime: file.type,
		name: file.name
	};

	$.post(url_ajax_signature, signature_params, function(response) {
		//send through the cross-domain page
		var windowFrame = document.getElementById('postMessageFrame').contentWindow;
		var data = {
			params: response.params,
			url: response.url,
			content: dataURL
		};

		//send the s3 request signature data and the base64 binary data
		windowFrame.postMessage(data, 'http://<?=$url_iframe_host?>');
	}, 'json');
}

We first get the POST params for S3 through AJAX from the PHP file, and then we send that data through postMessage to the S3 iframe so it can process it and upload it to S3, without needing to send the file to your server first and upload it to S3 from there. With this you can upload directly to S3 from the client’s browser, and if needed you can later check the contents to reject any unwanted files or malware.

All the code can be seen here:

https://github.com/danguer/blog-examples/tree/master/html5/s3/direct-upload

  • Guest

    You can now pass a File to postMessage as of Firefox 8.

  • Hojda Viorel

    Hello. Can you provide a working copy of this, or is it too much to ask? I’ve tried to make it work, but I don’t know PHP, so I got stuck there :( . What is “url_ajax_signature”? Is it the path to a PHP page made using the Amazon PHP SDK? Also, the name of the function is uploadImage, then uploadFile.

    Also, I think

    var signature_params = {
    mime: file.type,
    name: file.name
    };

    is wrong, it should be

    var signature_params = {
    mime: file[0].type,
    name: file[0].name
    };

    If you have a working copy and it’s not too much to ask, can you please detail a little bit?

    Cheers!

  • http://www.tweetegy.com/2012/01/save-an-image-file-directly-to-s3-from-a-web-browser-using-html5-and-backbone-js/ Save an image file directly to S3 from a web browser using HTML5 and Backbone.js | Tweetegy

    [...] Example from AWS Upload S3 files directly with AJAX Another example from [...]

  • Alec Koumjian

    FileReader is not currently supported by Safari. Are you aware of an alternative?

  • http://www.danguer.com Daniel Guerrero

    Unfortunately there is no alternative for Safari or IE in HTML5/JavaScript. An alternative would be to fall back to a Flash (or similar technology) component

  • http://www.danguer.com Daniel Guerrero

    Hi, I don’t have a working example, just the git sources. To set it up you must first set up the AWS SDK for PHP http://aws.amazon.com/sdkforphp/, upload the S3 files to your bucket, and change the setup configuration.

    The url_ajax_signature is the path where your php script is

  • Andrew Hayes

    Thanks for the great post, I’ve used this to successfully get images scaled/cropped in the client (Chrome & Firefox) to multiple sizes and uploaded to S3 using this approach for a Django site I work on.

    However, I have a strange issue where the XHR sent from upload.js (located in the S3 bucket, imported by the HTML in the S3 bucket, which is in turn imported into a webpage on my web server on its own non-S3 domain) creates the blob, packages the FormData with the S3 parameters (signed policy, signature etc.), but then returns an error after it’s completed.

    Looking on the web, the error returned is indicative of a cross-domain error (completed=4, status=0, responseText/responseXML are blank, no other information is provided).

    However when I look in the S3 bucket, the resized image is successfully stored ?:-/

    Any thoughts? Do you see this behaviour in your implementations?

    Thanks,

    Andy

  • Maca21free

    Same problem…:(

  • Wondering Coder

    hi, how do you apply in your code if the credentials I have are just the temporary credentials which consist of temp_accessKey, temp_secretKey and temp_token?

    Please help,

    Thanks