2 MB file upload to AWS S3 fails with Play Framework when uploading multiple files


I am getting this error and am not sure what it is actually pointing to. When I try to upload a 100 KB file, it uploads to S3 nicely, but a 2 MB file fails with no serious exception, only this:

[trace] play - Sending simple result: SimpleResult(200, Map(Content-Type -> text/html; charset=utf-8, Set-Cookie -> ))
[trace] application - Exception caught in Netty
java.lang.ClassCastException: scala.runtime.BoxedUnit cannot be cast to scala.Function0
    at play.core.server.netty.PlayDefaultUpstreamHandler.channelDisconnected(PlayDefaultUpstreamHandler.scala:50) ~[play.play_2.10-2.1.0.jar:2.1.0]
    at org.jboss.netty.handler.codec.replay.ReplayingDecoder.cleanup(ReplayingDecoder.java:570) ~[io.netty.netty-3.6.3.Final.jar:na]
    at org.jboss.netty.handler.codec.frame.FrameDecoder.channelDisconnected(FrameDecoder.java:365) ~[io.netty.netty-3.6.3.Final.jar:na]
    at org.jboss.netty.channel.Channels.fireChannelDisconnected(Channels.java:396) ~[io.netty.netty-3.6.3.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.AbstractNioWorker.close(AbstractNioWorker.java:336) ~[io.netty.netty-3.6.3.Final.jar:na]
    at org.jboss.netty.channel.socket.nio.NioServerSocketPipelineSink.handleAcceptedSocket(NioServerSocketPipelineSink.java:81) ~[io.netty.netty-3.6.3.Final.jar:na]
[trace] play - Http request received by netty: DefaultHttpRequest(chunked: false)
GET /favicon.ico HTTP/1.0
X-Real-IP: X.X.X.X
X-Scheme: http
X-Forwarded-For: X.X.X.X, X.X.X.X
Host: 
Connection: close
Accept: */*
Accept-Language: en-GB,en-US;q=0.8,en;q=0.6
Cookie: s_nr=1369981333181; PLAY_SESSION=f672ed541f895539f476

The request does not even reach the web server; it fails and throws a 500 error page, since the connection is closed abruptly.

I am using Play Framework 2.1.0 (per the trace above) with the AWS S3 high-level (TransferManager) multipart upload, and I wait in my service until the upload completes.

Please let me know what the issue could be, or give me any lead on this.

This is my code:

    //Setting expiration time for the files in the temp bucket to 24 hours [1 day]
    ObjectMetadata objectMetadata = new ObjectMetadata();
    objectMetadata.setExpirationTime(DateTime.now().plusDays(1).toDate());
    objectMetadata.setContentLength(UploadData.getFileSize());

    //Setting user metadata
    userData = new HashMap<>();
    userData.put(UserMetaData.filename.name(), UploadData.getFileName());
    if (!NullChecker.isEmpty(UploadData.getFileSize())) {
        userData.put(UserMetaData.size.name(), String.valueOf(UploadData.getFileSize()));
    }
    if (!NullChecker.isEmpty(UploadData.getSerialNo())) {
        userData.put(UserMetaData.serialno.name(), UploadData.getSerialNo());
    }
    objectMetadata.setUserMetadata(userData);

    // TransferManager processes all transfers asynchronously,
    // so this call returns immediately.
    upload = transferManager.upload(bucketName, key, UploadData.getFileInputStream(), objectMetadata);

    try {
        // Block and wait for the upload to finish.
        upload.waitForCompletion();
    } catch (AmazonClientException amazonClientException) {
        logger.error("Unable to upload file, upload was aborted.", amazonClientException);
    } catch (InterruptedException interruptedException) {
        // waitForCompletion() also declares the checked InterruptedException,
        // so it must be handled; restore the interrupt flag for callers.
        logger.error("Upload was interrupted before completion.", interruptedException);
        Thread.currentThread().interrupt();
    }
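
For context, the transferManager used above is the AWS SDK for Java (v1) TransferManager from com.amazonaws.services.s3.transfer. A minimal construction and shutdown sketch, where accessKey and secretKey are illustrative placeholders, looks roughly like this:

    // Build the TransferManager once and reuse it; it maintains its own thread pool.
    TransferManager transferManager = new TransferManager(
            new BasicAWSCredentials(accessKey, secretKey));

    // ... uploads happen here ...

    // Release the worker threads when the service shuts down.
    transferManager.shutdownNow();

shutdownNow() also closes the underlying S3 client; shutdownNow(false) keeps the client alive if it is shared elsewhere.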
There is 1 answer below.

Answered by AngryJS:

I am very much pleased with the responses I received here [actually none]; I am not sure if none of you geeks knew the issue or in fact had any input. Anyway, it got resolved, and here is the answer:

The issue was that Nginx was configured on our Unix machine, and when I checked the Nginx error logs, I found:

[error] 25556#0: *52 client intended to send too large body:

which was rejecting even 2 MB uploads. After raising the size limit in the configuration, it started working. The changes below will help you:

Changes in php.ini

To change the max file upload size to 100 MB:

vim /etc/php5/fpm/php.ini

Set…

upload_max_filesize = 100M
post_max_size = 100M
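
Note that the php.ini limits only apply when PHP handles the request; the "client intended to send too large body" message itself comes from Nginx, which caps request bodies with the client_max_body_size directive (default 1m). For a Play application proxied behind Nginx, that limit needs raising as well; a minimal sketch, assuming the default config path:

    # /etc/nginx/nginx.conf, inside the http, server, or location block
    # Default is 1m, which rejects a 2 MB upload body.
    client_max_body_size 100M;

Reload Nginx afterwards (nginx -s reload) for the change to take effect.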

Thanks,