Uploading large files to Azure blob storage using the BizTalk WCF-WebHttp adapter

  • Question

  • I am uploading files to Azure blob storage via a BizTalk send port using the WCF-WebHttp transport type.  This works fine for smaller files, but when I increase the size, e.g. to a 100 MB file, I get an error:

    System.Net.WebException: The remote server returned an unexpected response: (413) The request body is too large and exceeds the maximum permissible limit.

    <?xml version="1.0" encoding="utf-8"?><Error><Code>RequestBodyTooLarge</Code><Message>The request body is too large and exceeds the maximum permissible limit.
    RequestId:52ef8d47-0001-0062-3508-3176a1000000
    Time:2016-10-28T10:49:04.1319216Z</Message><MaxLimit>67108864</MaxLimit></Error>

    On the send port transport properties I have changed the "Maximum received message size (bytes)" to 1,073,741,824, i.e. 1 GB.  However, the MaxLimit error still occurs.

    As I understand it, there are some additional settings "under the hood" on this transport type, i.e. maxBufferPoolSize and maxBufferSize (in addition to maxReceivedMessageSize), as well as the transferMode of "Buffered", "Streamed" etc.  But these cannot be set on the WCF-WebHttp transport type.  I was thinking that changing one of these might help, but I can't access them.
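    For reference, those are real properties on WebHttpBinding in .NET; a rough sketch of how they would be set from code is below (values are illustrative only, and the WCF-WebHttp adapter does not surface these properties in its transport dialog):

    // Illustration only: the "under the hood" WebHttpBinding settings referred to above,
    // as they would be set from plain .NET code. The BizTalk WCF-WebHttp adapter does not
    // expose them, and the values here are examples.
    using System.ServiceModel;

    class BindingSketch
    {
        static WebHttpBinding CreateLargeMessageBinding()
        {
            return new WebHttpBinding
            {
                MaxReceivedMessageSize = 1073741824L,  // mirrors "Maximum received message size (bytes)"
                MaxBufferSize = 1073741824,            // must equal MaxReceivedMessageSize when Buffered
                MaxBufferPoolSize = 524288L,
                TransferMode = TransferMode.Streamed   // stream the body instead of buffering it all
            };
        }
    }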

    Anyone have any suggestions for uploading large files to Azure blob storage?

    Thanks


    




    • Edited by Bill Winder Friday, October 28, 2016 10:53 AM
    Friday, October 28, 2016 10:53 AM

Answers

All replies

  • Hello Bill,

    You can refer to an old thread: Cannot consume WCF service. (413) Request Entity Too Large
    Try removing the named binding configuration and using the default one.

    Refer the article: https://blogs.msdn.microsoft.com/dsnotes/2015/08/21/large-file-upload-failure-for-web-application-calling-wcf-service-413-request-entity-too-large/

    You should consider configuring WCF verbose-level tracing for the service and client. For more information on configuring WCF tracing, please see https://msdn.microsoft.com/en-us/library/ms733025%28v=vs.110%29.aspx

    Check IIS Request Filtering and set the maximum allowed content length to a higher value. There is also an IIS setting, “UploadReadAheadSize”, that prevents uploads and downloads of data greater than 48 KB. The default value is 49152 bytes, and it can be increased up to 4 GB.

    In addition to what is mentioned in the article, you should also consider setting the httpRuntime element's maxRequestLength attribute to an appropriate value in your web.config.

    Refer to the discussion here.


    Rachit Sikroria (Microsoft Azure MVP)

    Friday, October 28, 2016 11:04 AM
    Moderator
  • Based on the error:

    System.Net.WebException: The remote server returned an unexpected response: (413) The request body is too large and exceeds the maximum permissible limit.

    Maybe you can check the Azure settings to see what the size limit is there. It seems to be more about the Azure settings than the settings on your send port.


    Pi_xel_xar

    Blog: My Blog

    BizTalkApplicationDeploymentTool: BizTalk Application Deployment Tool/

    Friday, October 28, 2016 11:29 AM
    Answerer
  • System.Net.WebException: The remote server returned an unexpected response: (413) The request body is too large and exceeds the maximum permissible limit.

    The error suggests the gap is on the Azure endpoint side.
    Friday, October 28, 2016 11:47 AM
    Moderator
  • Thanks, however there aren't any settings I can change on the Azure storage side for blob file size; these limits are set by Microsoft and are much larger than the file I am uploading.

    https://azure.microsoft.com/en-gb/documentation/articles/azure-subscription-service-limits/#storage-limits

    In addition, I can upload large files to the same Azure storage using pure .NET methods; it is only when I use the BizTalk WCF-WebHttp adapter that I encounter the error.
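    A sketch of that kind of upload, assuming the WindowsAzure.Storage client library (container and blob names are just placeholders), is below; the client library switches to a block-based upload for larger files, which is presumably why it avoids the 64 MB single-request limit:

    // Sketch only, assuming the WindowsAzure.Storage client library (the classic
    // Microsoft.WindowsAzure.Storage namespace). UploadFromFile uses a block-based
    // upload for files above SingleBlobUploadThresholdInBytes, so it is not limited
    // to a single 64 MB PUT. Container and blob names are placeholders.
    using Microsoft.WindowsAzure.Storage;
    using Microsoft.WindowsAzure.Storage.Blob;

    class BlobUploadSketch
    {
        static void Upload(string connectionString, string localPath)
        {
            CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
            CloudBlobClient client = account.CreateCloudBlobClient();

            CloudBlobContainer container = client.GetContainerReference("uploads");
            container.CreateIfNotExists();

            CloudBlockBlob blob = container.GetBlockBlobReference("largefile.dat");
            blob.UploadFromFile(localPath); // handles chunking into blocks internally
        }
    }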

    Friday, October 28, 2016 12:13 PM
  • Thanks, however there aren't any settings I can change on the Azure storage side for blob file size; these limits are set by Microsoft and are much larger than the file I am uploading.

    https://azure.microsoft.com/en-gb/documentation/articles/azure-subscription-service-limits/#storage-limits

    In addition, I can upload large files to the same Azure storage using pure .NET methods; it is only when I use the BizTalk WCF-WebHttp adapter that I encounter the error.

    Hi Bill,

    This is documented:

    The maximum size for a block blob created via Put Blob is 64 MB. If your blob is larger than 64 MB, you must upload it as a set of blocks. For more information, see the Put Block and Put Block List operations. It's not necessary to also call Put Blob if you upload the blob as a set of blocks.

    If you attempt to upload a block blob that is larger than 64 MB, or a page blob larger than 1 TB, the service returns status code 413 (Request Entity Too Large). The Blob service also returns additional information about the error in the response, including the maximum blob size permitted in bytes.

    Please refer to the Blob service documentation.


    Pi_xel_xar

    Blog: My Blog

    BizTalkApplicationDeploymentTool: BizTalk Application Deployment Tool/

    Friday, October 28, 2016 1:01 PM
    Answerer
  • Well, I have discovered that the maximum size for writing a file to Azure storage in a single Put Blob operation is 64 MB:

    https://msdn.microsoft.com/library/azure/ee691964.aspx

    So this means the file must be broken into smaller blocks of no more than 4 MB each and then committed (using the Put Block and Put Block List operations).  I don't think the WCF-WebHttp or WCF-Custom transport types are going to be able to handle that, however.
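    Outside of BizTalk, the block-by-block upload looks roughly like this in plain .NET (a sketch assuming the WindowsAzure.Storage client library; the block naming scheme and 4 MB block size are only illustrative):

    // Sketch of an explicit Put Block / Put Block List upload, assuming the
    // WindowsAzure.Storage client library. Each block is at most 4 MB; Put Block List
    // then commits the uploaded blocks as a single blob.
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text;
    using Microsoft.WindowsAzure.Storage.Blob;

    class BlockUploadSketch
    {
        const int BlockSize = 4 * 1024 * 1024; // 4 MB per block

        static void UploadInBlocks(CloudBlockBlob blob, string localPath)
        {
            var blockIds = new List<string>();
            var buffer = new byte[BlockSize];

            using (FileStream file = File.OpenRead(localPath))
            {
                int blockNumber = 0;
                int bytesRead;
                while ((bytesRead = file.Read(buffer, 0, BlockSize)) > 0)
                {
                    // Block ids must be base64-encoded and all the same length within a blob.
                    string blockId = Convert.ToBase64String(
                        Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));
                    blockIds.Add(blockId);

                    using (var blockData = new MemoryStream(buffer, 0, bytesRead))
                    {
                        blob.PutBlock(blockId, blockData, null); // Put Block
                    }
                    blockNumber++;
                }
            }

            blob.PutBlockList(blockIds); // Put Block List commits the blocks
        }
    }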


    Friday, October 28, 2016 1:04 PM
  • Not sure, maybe not by default. Try checking whether there are any properties at the adapter level that can chunk the message into smaller blocks.

    Pi_xel_xar

    Blog: My Blog

    BizTalkApplicationDeploymentTool: BizTalk Application Deployment Tool/

    Friday, October 28, 2016 1:07 PM
    Answerer
  • Not sure, maybe not by default. Try checking whether there are any properties at the adapter level that can chunk the message into smaller blocks.

    Pi_xel_xar

    Blog: My Blog

    BizTalkApplicationDeploymentTool: BizTalk Application Deployment Tool/

    Pi_xel_xar, would you like to help out as an Answerer in this forum? Please review the guidelines and then accept on this thread; https://social.msdn.microsoft.com/Forums/en-US/a9ef4c76-69a4-4516-9f8c-09479f37e616/needed-more-answerers?forum=biztalkgeneral

    Thanks!


    Ed Price, Azure Development Customer Program Manager (Blog, Small Basic, Wiki Ninjas, Wiki)

    Answer an interesting question? Create a wiki article about it!

    Friday, October 28, 2016 2:30 PM
    Owner
  • Thanks, there are no options at the adapter level to chunk the message into smaller blocks - I am using the WCF-WebHttp adapter.

    I looked at using the WCF-Custom adapter as an alternative (with webHttpBinding selected as the binding type), since this has settings for buffer size, amongst other things.

     However, it is not possible to set the HTTP method and URL mappings on the WCF-Custom adapter, which is essential for blob storage, so this was a no-go (I posted another question about this).


    • Edited by Bill Winder Wednesday, November 2, 2016 4:52 PM
    Wednesday, November 2, 2016 4:51 PM
  • Hi,

    I am facing the same issue now: I am not able to upload files larger than 64 MB. Please let me know if you found a solution for this issue.

    Tuesday, August 6, 2019 10:23 AM