AzureCopy

Update (24/12/2016)

A newer version of AzureCopy is being developed and IT’S CROSS PLATFORM! 🙂

I’ve had numerous requests for a Linux and/or macOS version, and one is now being developed. It’s currently VERY alpha but is slowly gaining features. I’ll start adding binaries to the GitHub repo (note: a different repo to the regular AzureCopy).

I’ll be posting some blog posts about the upcoming changes. Both versions of AzureCopy will continue to be developed (the regular .NET version for Windows and the cross-platform Go version).

Now, back to your regular content 🙂

 

Updated, new, improved and generally neatened up AzureCopy documentation!

AzureCopy is both a command line tool and a Nuget package which can be used to copy blobs (files) between Azure, S3, Onedrive (formerly Skydrive), Sharepoint Online, Dropbox and your local filesystem.

Command

Once the command is configured (see below), copying between cloud providers is simple.

Examples:

To copy a blob from Amazon S3 to Azure Blob storage:

azurecopy -i https://mybucket.s3.amazonaws.com/ -o https://myaccount.blob.core.windows.net/mycontainer/myblob

To copy a blob from Azure to Dropbox:

azurecopy -i https://myaccount.blob.core.windows.net/mycontainer/myblob -o https://dropbox.com/mydirectory/

To copy a blob from Azure to Onedrive:

azurecopy -i https://myaccount.blob.core.windows.net/mycontainer/myblob -o one://myfolder

To copy a blob from Onedrive to Sharepoint Online:

azurecopy -i one://myfolder/myblob -o "https://testuser.sharepoint.com/Shared Documents"

and finally… to copy from my local drive to Azure:

azurecopy -i "c:\temp\myfile" -o https://myaccount.blob.core.windows.net/mycontainer/

Any combination of the above is allowed as well. So although AzureCopy has “Azure” in the name, it can certainly be used for many other purposes!

If the target is Azure then AzureCopy can make use of the CopyBlob API. This lets Azure itself perform the copy between the source URL and the Azure target URL, which means that although you execute the command on your own machine in your own network, the bandwidth used (say, for copying from S3 to Azure) is purely between Amazon’s and Microsoft’s datacentres. Your bandwidth is NOT used! To enable this feature, just make sure the target destination (specified with -o) is an Azure URL and add -blobcopy to the command parameters.

For example:

azurecopy -i https://mybucket.s3.amazonaws.com/ -o https://myaccount.blob.core.windows.net/mycontainer/myblob -blobcopy

Configuration

Configuration of the command line tool is primarily done through azurecopy.exe.config (aka app.config for any .NET followers out there), although any parameter that can be configured in the config file can also be passed on the command line if so desired. Be warned though: most cloud storage providers like long alphanumeric tokens/keys/secrets, so keeping these in a config file is usually less hassle.
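For example, the S3 and Azure credentials can be supplied directly on the command line; the flags shown (-azurekey, -s3k, -s3sk) are the ones used in the comments further down this page, and the angle-bracket values are placeholders for your own keys:

```shell
# Pass credentials on the command line instead of azurecopy.exe.config:
#   -azurekey : Azure account key    (AzureAccountKey)
#   -s3k      : AWS access key ID    (AWSAccessKeyID)
#   -s3sk     : AWS secret key       (AWSSecretAccessKeyID)
azurecopy -i https://mybucket.s3.amazonaws.com/ -o https://myaccount.blob.core.windows.net/mycontainer/ -azurekey <AzureAccountKey> -s3k <AWSAccessKeyID> -s3sk <AWSSecretAccessKeyID>
```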

An example of the config file is (well the interesting parts):

<add key="AzureAccountKey" value=""/>

This is the Azure account key taken from the Azure portal. It is used as the default source or target Azure key, and needs to match the Azure blob URL used in the copy command.

<add key="SrcAzureAccountKey" value=""/>

This is the Azure account key taken from the Azure portal. It is used as the source Azure key, and needs to match the source Azure blob URL used in the copy command.

<add key="TargetAzureAccountKey" value=""/>

This is the Azure account key taken from the Azure portal. It is used as the target Azure key, and needs to match the target Azure blob URL used in the copy command.

<add key="AWSAccessKeyID" value=""/>

This is the Amazon S3 access key ID taken from the S3 portal. It is used as the default if the source/target versions of this parameter are not supplied.

<add key="AWSSecretAccessKeyID" value=""/>

This is the Amazon S3 secret access key taken from the S3 portal. It is used as the default if the source/target versions of this parameter are not supplied.

<add key="AWSRegion" value=""/>

This is the Amazon S3 region for the account specified above. It is used as the default if the source/target versions of this parameter are not supplied. Valid regions are listed on Amazon’s S3 region page.

<add key="SrcAWSAccessKeyID" value=""/>

This is the Amazon S3 access key ID taken from the S3 portal. It is used as the credentials when S3 is the source of the copy command.

<add key="SrcAWSSecretAccessKeyID" value=""/>

This is the Amazon S3 secret access key taken from the S3 portal. It is used as the credentials when S3 is the source of the copy command.

<add key="SrcAWSRegion" value=""/>

This is the Amazon S3 region for the account specified above. It is used as the region when S3 is the source of the copy command. Valid regions are listed on Amazon’s S3 region page.

<add key="TargetAWSAccessKeyID" value=""/>

This is the Amazon S3 access key ID taken from the S3 portal. It is used as the credentials when S3 is the target of the copy command.

<add key="TargetAWSSecretAccessKeyID" value=""/>

This is the Amazon S3 secret access key taken from the S3 portal. It is used as the credentials when S3 is the target of the copy command.

<add key="TargetAWSRegion" value=""/>

This is the Amazon S3 region for the account specified above. It is used as the region when S3 is the target of the copy command. Valid regions are listed on Amazon’s S3 region page.

<add key="SkyDriveCode" value=""/>

This is the Onedrive code retrieved after authenticating with Onedrive through a browser. Please execute “azurecopy -configonedrive” for more details. Yes, it is currently a little confusing that the settings are called SkyDrive as opposed to Onedrive.

This is simply due to Microsoft renaming the product, and may eventually get changed here.

<add key="SkyDriveRefreshToken" value=""/>

This is the Onedrive refresh token, which must be valid for access to Onedrive. It is populated when executing azurecopy and does not require manual configuration.

<add key="DropBoxAPIKey" value=""/>

This is the Dropbox API key used to authenticate against Dropbox. For the binary AzureCopy command this parameter is not read from the config file but is hardcoded into the executable, since “AzureCopy” is the registered app against Dropbox. If you are using the source/Nuget package for your own app development then you’d populate this value in the config file.

For the command, please execute “azurecopy -configdropbox” for more details.

<add key="DropBoxAPISecret" value=""/>

This is the Dropbox API secret used to authenticate against Dropbox. See above.

<add key="SharepointUsername" value="test@test.onmicrosoft.com"/>

This is the Sharepoint username used to authenticate against Sharepoint Online.

<add key="SharepointPassword" value=""/>

This is the password for the above username.
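Putting these together, the interesting part of azurecopy.exe.config is just a standard appSettings block. A consolidated skeleton (values are placeholders; only fill in the services you actually use):

```xml
<configuration>
  <appSettings>
    <!-- Azure: default key, plus optional source/target overrides -->
    <add key="AzureAccountKey" value=""/>
    <add key="SrcAzureAccountKey" value=""/>
    <add key="TargetAzureAccountKey" value=""/>

    <!-- Amazon S3: default credentials and region (Src*/Target* variants also exist) -->
    <add key="AWSAccessKeyID" value=""/>
    <add key="AWSSecretAccessKeyID" value=""/>
    <add key="AWSRegion" value=""/>

    <!-- Onedrive: settings still use the old SkyDrive name -->
    <add key="SkyDriveCode" value=""/>
    <add key="SkyDriveRefreshToken" value=""/>

    <!-- Dropbox: only needed when building your own app from source/Nuget -->
    <add key="DropBoxAPIKey" value=""/>
    <add key="DropBoxAPISecret" value=""/>

    <!-- Sharepoint Online -->
    <add key="SharepointUsername" value=""/>
    <add key="SharepointPassword" value=""/>
  </appSettings>
</configuration>
```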

Nuget Package

See Nuget for details.

Will document more soon.

Github

See Github for details.

Will document more soon.

84 thoughts on “AzureCopy”

    • It looks as if onedrive/skydrive isn’t configured properly. Did you first run the command: azurecopy -configonedrive

      Then follow the instructions? It should display a url that you need to put into a browser (for onedrive to authorize azurecopy to access your data).

      At no stage should you need to manually modify the azurecopy.exe.config, the -configonedrive command should do that for you.

      I realise it’s a little cumbersome but it’s the only way I have it working for a command line tool currently. I can chat on google+/twitter etc. if that’s easier, should you need more help debugging this.

    • Ahh oops. I’ve documented that on my blog and pages about AzureCopy but obviously forgot the readme! I’ll fix that later today and release 0.16.1 with the text changes. Thanks for pointing that out. After trying -configonedrive did it work for you?

  1. Thank you very much. It works! But it turns out it only downloads a single file, “one://mifolder/RestSharp.dll”. When I use “one://myfolder” or “one://myfolder/” I get the error: Object reference not set to an instance of an object.

  2. Sorry but no. Subfolders aren’t available *yet* but are on my todo list. Also if a folder has a subfolder in it, it will cause issues, so copying multiple files depends on the folder in question not having any subfolders in it at all. Annoying, but honestly I hadn’t got around to subfolders yet. Given you’re asking, it’s jumping up my todo list 🙂

  3. Hi, I tried to use the application. Unfortunately I am facing some problems. The error says unknown error generated. I tried to run in debug mode, and it gives errors like invalid parameters, bucket does not exist etc. (two errors for two different versions of azurecopy). BTW the bucket does exist, and the Azure connection is fine. This error happens when I try to use AWS S3. Could you please give me any suggestion on what might have gone wrong?

  4. I grabbed the latest zip from github, compiled it, and am attempting to run it. I’m trying to copy from Azure Blob to AWS S3. However, the -i parameter doesn’t seem to work recursively. I’m new to this language and can’t seem to figure out how to modify the .cs so that when azurecopy reads the -i parameter, and the -i parameter is the blob (https://myaccount.blob.core.windows.net/mycontainer/myblob), it will go into any subdirectory it needs to and/or read any file in the base url and then do the copy. Otherwise I have to build a separate script that will do the -list command, pipe that out to a file or whatever, and then go line by line (doing additional -list commands for sub directories) doing the copy from blob to S3. Hopefully this makes sense. Basically, the current azurecopy program as compiled from the github download does not recursively read from Azure blob. It does from S3 or my hard drive, but not from blob. Any thoughts would be appreciated.

    • Hi

      Turns out I do have a bug there. I’ll try and add a fix later tonight. One thing to keep in mind though, I don’t think it will address your immediate concern. Both S3 and Azure Blob Storage only “simulate” directories, they don’t really exist. The way they do it is by having / in the actual blob name. For example you might have a blob called “mydir1/myfile1”. It’s a single blob and that’s its name. What UI programs will do is look for the / and make it appear as if there is a directory called “mydir1” and inside that is a blob called “myfile1”.

      Currently (well *ahem* once I fix this typo) AzureCopy will have 2 methods of copying multiple blobs.
      One command is where you don’t specify the virtual directory at all, eg.

      azurecopy -i https://myacct.blob.core.windows.net/mycontainer/ -o https://mybucket.s3.amazonaws.com/

      That will copy all files (virtual directories or not) from azure container “mycontainer” to the S3 bucket “mybucket”

      If you just wanted to copy all blobs in a virtual directory called “vdir1” (for example), Then the command

      azurecopy -i https://myacct.blob.core.windows.net/mycontainer/vdir1/ -o https://mybucket.s3.amazonaws.com/

      would copy all blobs starting with the prefix “vdir1/” to S3. This would ALSO strip the vdir1/ from the blobname.
      eg. on Azure you might have 3 blobs called vdir1/file1, vdir1/file2 and vdir1/file3

      using the above command you’d end up with 3 files in S3 called file1, file2 and file3.
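      As a quick illustrative sketch of that stripping rule (plain shell string handling, just to show the mapping; AzureCopy itself does this internally in C#, and vdir1/ is only the example prefix from above):

```shell
#!/bin/sh
# Remove a virtual-directory prefix from a blob name, as described above.
# e.g. copying with -i ...mycontainer/vdir1/ strips "vdir1/" from each name.
strip_prefix() {
    printf '%s\n' "${1#"$2"}"
}

strip_prefix "vdir1/file1" "vdir1/"   # -> file1
strip_prefix "vdir1/file2" "vdir1/"   # -> file2
```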

      Do any of these scenarios match what you’re after?

      I’ll try and put out a point release with that 1 char fix later tonight.

      Ken

      • Thanks so much for the update. This is a really powerful tool and I thank you for putting it together. I believe what you explained will capture what I’m after. Although, being new to this type of situation, I can’t really say for sure until I actually test it out. Your first command I believe is what I’m after, copying everything under the root of the blob up to the root of S3. A question I have with that is, will the copy process create the “directory” structure on S3 as it’s copying? Kind of like when you use windows explorer and copy a file folder and all of its contents to another location, it creates all sub folders as well as the files contained in them. Or will I need to replicate the directory structure on S3 first and then perform the copy? I’ll look for the change to bring it down and build it. Again, thank you for your help on this, it really saves a ton of time.

  5. I’ve just released version 1.1.0 (see https://github.com/kpfaulkner/azurecopy/releases/tag/1.1.0 ). I performed a larger change than what you requested, just to make sure it was consistent with Azure and S3. I haven’t checked Onedrive/dropbox yet, hence marking it as a Prerelease.

    Yes, if you perform the command:

    azurecopy -i https://myacct.blob.core.windows.net/mycontainer/ -o https://bucket.s3.amazonaws.com/

    then all the “virtual directory” structure should copy along into S3. I’ve tested a number of scenarios this morning and no issues.

    Please let me know if this does what you’re after.

    Thanks

    Ken

  6. Hi,

    When I try to copy a file from AWS S3 to Azure blob storage I get the following error:
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag.
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.StartCopyFromBlob(Uri source, AccessCondition sourceAccessCondition, AccessCondition destAccessCondition, BlobRequestOptions options, OperationContext operationContext)
    at azurecopy.AzureBlobCopyHandler.StartCopy(String sourceUrl, String DestinationUrl, DestinationBlobType destBlobType)
    at azurecopycommand.Program.DoNormalCopy()
    at azurecopycommand.Program.Process()
    at azurecopycommand.Program.Main(String[] args)

    I am able to authenticate to both cloud services successfully and list files at both locations.

    Please help!

    Thanks

    • Hmmm I’d expect the Blob to be 0 bytes for quite a while during the copying process.
      Couple of questions,

      1) Did you use the blobcopy flag?
      2) Does the command you’re trying work fine with a smaller file?
      3) Did any errors get displayed/generated during the copy?

      Thanks

      Ken

  7. This does not seem to work if you supply an S3 URL that belongs to an AWS S3-compatible storage repository. In other words, are we restricted to using the Amazon S3 DNS for our S3 URLs? I have not checked yet, but I assume the same goes for Azure URLs, if you supply a development URL.

    • I honestly didn’t realise there were other systems that were compatible with the S3 protocol. Azurecopy does work with the Azure storage emulator (because I knew about it).

      Do you have any links to the S3 compatible system you’re talking about?

      Thanks

      Ken

  8. There are various systems that will work with the S3 API.
    For example, the article below describes building hybrid object storage solutions and gives a few examples of such:
    http://community.netapp.com/t5/Tech-OnTap-Articles/StorageGRID-Webscale-Nonstop-Object-Storage-for-Enterprise-and-Cloud/ta-p/90141

    I think the main thing to fix in AzureCopy is to allow for more flexible URLs and not couple them to the AWS domain. There are a couple of other areas to improve; perhaps we can take this offline as I am making some of these changes myself. You should have access to my email. I was hoping to submit to your GitHub at a later point, after review.

  9. Hi, I am trying to use the azurecopy tool, version 0.16.2.0.
    I want to migrate some files from S3 storage to Azure blob storage.
    I have set the access key ID, secret access key and other values as specified in the documents.
    To copy from one cloud provider to another I’m typing the following command:
    azurecopy -i https://tdtsample.s3.amazonaws.com/ -o https://awsdatastorage2.blob.core.windows.net/awscontainer

    but I’m hitting the following issue:
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag.
    at System.Uri.CreateThis(String uri, Boolean dontEscape, UriKind uriKind)
    at System.Uri..ctor(String uriString)
    at Amazon.Runtime.ClientConfig.GetUrl(RegionEndpoint regionEndpoint, String regionEndpointServiceName, Boolean useHttp)
    at Amazon.Runtime.ClientConfig.DetermineServiceURL()
    at Amazon.Runtime.Internal.EndpointResolver.DetermineEndpoint(ClientConfig config, IRequest request)
    at Amazon.S3.Internal.AmazonS3PostMarshallHandler.ProcessRequestHandlers(IExecutionContext executionContext)
    at Amazon.S3.Internal.AmazonS3PostMarshallHandler.PreInvoke(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericExceptionHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.GenericExceptionHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.PipelineHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.MetricsHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.RuntimePipeline.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.AmazonServiceClient.Invoke[TRequest,TResponse](TRequest request, IMarshaller`2 marshaller, ResponseUnmarshaller unmarshaller)
    at Amazon.S3.AmazonS3Client.GetBucketLocation(GetBucketLocationRequest request)
    at Amazon.S3.AmazonS3Client.GetBucketLocation(String bucketName)
    at azurecopy.Utils.S3Helper.GenerateS3Client(String accessKey, String secretKey, String bucketName)
    at azurecopy.S3Handler.ListBlobsInContainer(String baseUrl)
    at azurecopycommand.Program.GetSourceBlobList(IBlobHandler inputHandler, String url)
    at azurecopycommand.Program.DoNormalCopy()
    at azurecopycommand.Program.Process()
    at azurecopycommand.Program.Main(String[] args)

    Can anyone help me out with this?
    I have uploaded a dummy file in this storage to check if it’s getting migrated, but I’m still stuck 😦

  10. Hi Ken,
    First up, great utility!
    I’m using it to transfer files between S3 and Azure blob store.
    I’ve recently updated to the latest version and I’ve noticed a change in behaviour using blobcopy. Previously, if I had a list of files to transfer, each would take 1-2 seconds to ‘queue’ up in Azure, and I use some powershell to track them and ensure all files complete; so more an asynchronous transfer. However, with the latest version the same transfer scripts result in synchronous behaviour, with azurecopy polling and moving on only once the current file has been transferred. Is this a change in the default behaviour, and is there a corresponding switch to transfer asynchronously?
    A real plus point however is the addition of the CopyID. I was going to request that as it provides me with a key to tally between what’s been added to the transfer queue and what’s succeeded when polling Azure. Particularly useful in the asynchronous scenario.
    thanks
    Steve.

  11. Looks like a bug:
    azurecopy-1.3.1\azurecopycommand\Program.cs
    Line 104: const string MonitorBlobCopyFlag = "-dm";
    Line 295: case MonitorBlobCopyFlag:
    Line 296: ConfigHelper.MonitorBlobCopy = true;

    Looking at how it’s used in AzureBlobCopyHandler.cs the above line 296 in Program.cs should be
    ConfigHelper.MonitorBlobCopy = false;

    As it currently stands it’s impossible to switch off. 🙂
    For now we’ll work with 1.3.0 until this is fixed but I’m keen to use this version for the returned “BlobCopy ID”.

    Thanks
    Steve.

  12. Yes, of course
    First I run these commands:
    cd C:\Program Files\azurecopy\
    set AzureAccountKey=MyAzureKey
    set AWSAccessKeyID=MyAwsKeys
    set AWSSecretAccessKeyID=MyAWSSecret
    set AWSRegion=us-east-1
    Then I run this command:
    azurecopy -i https://my-bucketnameinaws.s3.amazonaws.com/ -o https://amazons3bak.blob.core.windows.net/mybucketnameinazure -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID% -blobcopy -destblobtype block

    I also thought the problem might be with the keys. I checked them several times. In Azure it is possible to regenerate the key.
    By the way, if you enter commands:
    azurecopy -list https://mybucket.s3-us-west-2.amazonaws.com/ -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%
    and
    azurecopy -list https://mystorage.blob.core.windows.net/mycontainer -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID%
    I can see all of my data in buckets
    But I get an error when copying.
    Just recently I set up an Amazon S3 “Bucket Lifecycle” rule to move all data older than 14 days to S3 Glacier.
    Maybe that rule is causing the error?

    • Hi Taras,

      Thanks, ok that helps clarify a few things. To try and eliminate some issues, can we try a few things?

      Firstly, can we try copying a specific blob instead of all the blobs in the bucket? For this the URL for the -i parameter should be a complete URL to a specific blob.
      Also, if the blob isn’t big, can you remove the “-blobcopy” flag? I just want to eliminate problems. The -destblobtype block flag isn’t really needed either (it’s the default).

      I’m unsure about Glacier, I’ve never used it myself. You’re right, maybe that’s causing an issue. If the above steps also fail (which would be really strange, since I have lots of people using Azurecopy for the exact same scenario S3 -> Azure) I’ll have to investigate if Glacier is breaking things.

      Please let me know how the single blob copy goes and we’ll figure out where to go from there.

      Thanks

      Ken

  13. I created a new bucket in Amazon S3 without lifecycle rules, uploaded some files and tried to copy them to Azure, and it works! Thank you!
    The problem was the lifecycle rule that moves data to Glacier.
    Now I need to turn off the lifecycle for my buckets, or recreate them.

    Thank you!

  14. I am trying to copy files from S3 to Azure blob storage; it copies a few files and then generates this error. The stack trace is given below:
    GetHandler start
    GetHandler retrieved azurecopy.S3Handler
    GetHandler start
    GetHandler retrieved azurecopy.AzureHandler
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag.
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    at Microsoft.WindowsAzure.Storage.Blob.CloudBlockBlob.StartCopyFromBlob(Uri source, AccessCondition sourceAccessCondition, AccessCondition destAccessCondition, BlobRequestOptions options, OperationContext operationContext)
    at azurecopy.AzureBlobCopyHandler.StartCopy(BasicBlobContainer origBlob, String DestinationUrl, DestinationBlobType destBlobType)
    at azurecopycommand.Program.DoNormalCopy()
    at azurecopycommand.Program.Process(Boolean debugMode)
    at azurecopycommand.Program.Main(String[] args)

    Any suggestions here …

  15. AzureCopy looks perfect to me – we will need to copy blobs across from S3 to Azure. I would prefer to use the Nuget package with a webjob if possible; however, I noticed a lack of documentation and examples. It would be useful to have a few examples of the main use cases, e.g. how to copy a blob from S3 to Azure in C# would be a good starting point.

    • Hi

      Admittedly I haven’t kept the nuget packages up to date lately (by far most users are after the command line tool which they get from github). I’ll update the Nuget package in the next few days.
      As for documentation, I think I have a few blog posts about it (but will need to check if they need updating). I’ll reply here once updated.

      Ken

    • I’m a huge fan of Azure Functions. I use Webjobs at work for easy quick execution of tasks based on queues. I’m definitely hoping to switch those to AzureFunctions if the rest of the team agree.

  16. Hi – I’m using version 1.4.1 and trying to copy a file from Onedrive. I set up the OAuth code as described and see it in the config file. However, the command below returns a 400 error.

    D:\Utils\azurecopy-1.4.1\azurecopy>azurecopy.exe -i one://Files/PET/tii_data_test.bacpac -o d:\test -db
    GetHandler start
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag. System.Net.WebException: The remote server returned an error: (400) Bad Request.
    at System.Net.HttpWebRequest.GetResponse()
    at azurecopy.Helpers.SkyDriveHelper.RefreshAccessToken()
    at azurecopy.Helpers.SkyDriveHelper.GetAccessToken()
    at azurecopy.SkyDriveHandler..ctor(String url)
    at azurecopycommand.Program.GetHandler(UrlType urlType, String url)
    at azurecopycommand.Program.DoNormalCopy(Boolean debugMode)
    at azurecopycommand.Program.Main(String[] args)

  17. Hi

    Can you try the following (just to rule out a few things):

    – Make sure the SkyDriveCode and SkyDriveRefreshToken settings in the azurecopy.exe.config file are empty (ie just “”).
    – run the command: azurecopy.exe -configonedrive (and follow the instructions)

    Please let me know the results of that (will help).

    Thanks

    Ken

  18. Hi, I’m now trying to list an AWS bucket. However, despite the fact that I can see the bucket fine in Cloudberry, I’m not able to -list it using AzureCopy. I also tried a copy without success. I am able to list my Azure container fine and my AWS keys work in Cloudberry.

    I use the command:
    azurecopy.exe -list https://transportplanning.tii.ie.s3-eu-west-1.amazonaws.com/ -azurekey %AzureAccountKey% -s3k %AWSAccessKeyID% -s3sk %AWSSecretAccessKeyID% -db

    GetHandler start
    GetHandler retrieved azurecopy.S3Handler
    container name transportplanning.tii.ie
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag. Amazon.S3.AmazonS3Exception: The request signature we calculated does not match the signature you provided. Check your key and signing method. ---> Amazon.Runtime.Internal.HttpErrorResponseException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
    at System.Net.HttpWebRequest.GetResponse()
    at Amazon.Runtime.Internal.HttpRequest.GetResponse()
    — End of inner exception stack trace —
    at Amazon.Runtime.Internal.HttpRequest.GetResponse()
    at Amazon.Runtime.Internal.HttpHandler`1.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.RedirectHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.Unmarshaller.InvokeSync(IExecutionContext executionContext)
    at Amazon.S3.Internal.AmazonS3ResponseHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.ErrorHandler.InvokeSync(IExecutionContext executionContext)
    — End of inner exception stack trace —
    at Amazon.Runtime.Internal.HttpErrorResponseExceptionHandler.HandleException(IExecutionContext executionContext, HttpErrorResponseException exception)
    at Amazon.Runtime.Internal.ErrorHandler.ProcessException(IExecutionContext executionContext, Exception exception)
    at Amazon.Runtime.Internal.ErrorHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.CallbackHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.RetryHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.CallbackHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.CallbackHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.S3.Internal.AmazonS3ExceptionHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.ErrorCallbackHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.MetricsHandler.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.Internal.RuntimePipeline.InvokeSync(IExecutionContext executionContext)
    at Amazon.Runtime.AmazonServiceClient.Invoke[TRequest,TResponse](TRequest request, IMarshaller`2 marshaller, ResponseUnmarshaller unmarshaller)
    at Amazon.S3.AmazonS3Client.GetBucketLocation(GetBucketLocationRequest request)
    at azurecopy.Utils.S3Helper.GenerateS3Client(String accessKey, String secretKey, String bucketName)
    at azurecopy.S3Handler.d__25.MoveNext()
    at azurecopycommand.Program.DoList(Boolean debugMode)
    at azurecopycommand.Program.Main(String[] args)

    • Hi

      I *think* I see the problem with that. Can you edit the azurecopy.exe.config and add the line:

      (or modify the existing AWSRegion entry).

      If that fixes the problem, I’ll modify AzureCopy so the S3 region should be added as a command line parameter (which I need to do anyway).

      Thanks

      Ken
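
      (Editor's note: the config line in the reply above appears to have been stripped by the comment formatting. Assuming the appSettings-style azurecopy.exe.config that the rest of this thread refers to, the entry was presumably of this shape; the region value below is only an example and should match your bucket's actual region:)

      ```xml
      <configuration>
        <appSettings>
          <!-- Region of the source S3 bucket; eu-west-1 is an example value only -->
          <add key="AWSRegion" value="eu-west-1" />
        </appSettings>
      </configuration>
      ```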

  19. Hi Ken – I had already set this and all the other keys in the config. I was trying the command line because it didn’t work. Here is the entry for region from my config file. I noticed that there isn’t a command line parameter for region.

  20. Hi Ken – I got it to work by setting ALL the entries in the config file relating to AWS – I mean setting the src and trg ones as well, all to the same account. Thanks for your help!

  21. Hi again Ken,

    It seems I spoke too soon – most things are working now, except I can’t get the -blobcopy option to work. It seems that Azure is denying access in this case. If I remove -blobcopy then it works fine. Another gotcha was that I had to use a URL format where the bucket follows the base AWS URL to get it to work. Unfortunately it’s the -blobcopy option that I need – I have a massive amount of data to transfer.

    D:\Utils\azurecopy-1.4.1\azurecopy>azurecopy.exe -i "https://s3-eu-west-1.amazonaws.com/transportplanning.tii.ie/5_3 National Road Indices Update/Sample PVR/columnNames.csv" -o https://tiidata.blob.core.windows.net/rawcounts -blobcopy -db -v
    GetHandler start
    GetHandler retrieved azurecopy.S3Handler
    GetHandler start
    GetHandler retrieved azurecopy.AzureHandler
    Copy blob 5_3 National Road Indices Update/Sample PVR/columnNames.csv
    Unable to start copying 5_3 National Road Indices Update/Sample PVR/columnNames.csv
    Exception Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
    at System.Net.HttpWebRequest.GetResponse()
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    --- End of inner exception stack trace ---
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    at azurecopy.AzureBlobCopyHandler.StartCopy(BasicBlobContainer origBlob, String DestinationUrl, DestinationBlobType destBlobType, Boolean skipIfExists)
    at azurecopy.AzureBlobCopyHandler.StartCopyList(IEnumerable`1 origBlobList, String destinationUrl, DestinationBlobType destBlobType, Boolean debugMode, Boolean skipIfExists)
    Request Information
    RequestID:08a4ba30-0001-0011-0136-d5f4ba000000
    RequestDate:Thu, 25 May 2017 09:07:03 GMT
    StatusMessage:Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    ErrorCode:AuthenticationFailed

    New Batch

    Failed:
    Aborted:
    Pending:
    Copy complete
    Operation took 1433 ms

    Successful – normal copy…

    D:\Utils\azurecopy-1.4.1\azurecopy>copyfromaws5.bat

    D:\Utils\azurecopy-1.4.1\azurecopy>azurecopy.exe -i "https://s3-eu-west-1.amazonaws.com/transportplanning.tii.ie/5_3 National Road Indices Update/Sample PVR/columnNames.csv" -o https://tiidata.blob.core.windows.net/rawcounts -db -v
    GetHandler start
    GetHandler retrieved azurecopy.S3Handler
    GetHandler start
    GetHandler retrieved azurecopy.AzureHandler
    Copying blob to https://tiidata.blob.core.windows.net/rawcounts
    Operation took 1573 ms

    • Hi,

      Ok, firstly can you try with a trailing slash on the Azure URL, i.e. https://…./rawcounts/

      The only “odd” thing I can see that differs from how it’s commonly used is the URL. Normally people have used s3.amazonaws.com and set the region in the config file. Can you also try that? Otherwise I’ll set up a similar account in a different zone and will try to replicate your exact issue.

      (sorry for the delay).

      Ken

  22. Hi Ken,

    I tried all the suggestions without much success. I don’t see much difference in the output of the special build below. I switched to using s3.amazonaws.com and setting the region in the file and this works fine for the normal copy without the -blobcopy switch. Do you really think that the error is due to s3 and not Azure? The exception “Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (403) Forbidden” suggested to me that Azure might be the problem.

    Another unusual feature of the blobs I’m trying to copy is the spaces in the folder names – could this be the problem? Unfortunately this is outside of my control – it’s a 3rd-party S3 account I’m copying from, to get road/highway data for roads in Ireland for analysis.

    I can also send you the (successful) results of a -list operation on the S3 bucket if there’s somewhere I can put a text file…

    d:\Utils\azurecopy-1.4.2-pre\azurecopy>azurecopy.exe -i "https://s3.amazonaws.com/transportplanning.tii.ie/5_3 National Road Indices Update/Sample PVR/columnNames.csv" -o https://tiidata.blob.core.windows.net/rawcounts/ -blobcopy -db -v
    GetHandler start
    GetHandler retrieved azurecopy.S3Handler
    GetHandler start
    GetHandler retrieved azurecopy.AzureHandler
    Copy blob 5_3 National Road Indices Update/Sample PVR/columnNames.csv
    Unable to start copying 5_3 National Road Indices Update/Sample PVR/columnNames.csv
    Exception Microsoft.WindowsAzure.Storage.StorageException: The remote server returned an error: (403) Forbidden. ---> System.Net.WebException: The remote server returned an error: (403) Forbidden.
    at System.Net.HttpWebRequest.GetResponse()
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    --- End of inner exception stack trace ---
    at Microsoft.WindowsAzure.Storage.Core.Executor.Executor.ExecuteSync[T](RESTCommand`1 cmd, IRetryPolicy policy, OperationContext operationContext)
    at azurecopy.AzureBlobCopyHandler.StartCopy(BasicBlobContainer origBlob, String DestinationUrl, DestinationBlobType destBlobType, Boolean skipIfExists)
    at azurecopy.AzureBlobCopyHandler.StartCopyList(IEnumerable`1 origBlobList, String destinationUrl, DestinationBlobType destBlobType, Boolean debugMode, Boolean skipIfExists)
    Request Information
    RequestID:c8f9e625-0001-0001-09f7-d9c25c000000
    RequestDate:Wed, 31 May 2017 10:23:37 GMT
    StatusMessage:Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
    ErrorCode:AuthenticationFailed

    New Batch

    Failed:
    Aborted:
    Pending:
    Copy complete
    Operation took 2666 ms

    d:\Utils\azurecopy-1.4.2-pre\azurecopy>
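
    (Editor's note: regarding the spaces in the folder names above – a URL used as a copy source cannot contain literal spaces, so if the S3 key is embedded in the source or destination URL without percent-encoding, the signed request can come out malformed, which is one plausible cause of a signature-related 403. A minimal sketch of the encoding, in Python for illustration only – azurecopy itself is .NET:)

    ```python
    from urllib.parse import quote

    # Percent-encode an S3 object key for use inside a URL, leaving the
    # "/" path separators intact. Spaces become %20.
    key = "5_3 National Road Indices Update/Sample PVR/columnNames.csv"
    encoded = quote(key, safe="/")
    print(encoded)
    # 5_3%20National%20Road%20Indices%20Update/Sample%20PVR/columnNames.csv
    ```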

  23. Another thing occurred to me. Do I need to configure anything in the Azure blob container or the Azure storage account to allow the blobcopy API to work its magic? I haven’t done anything special as yet – it’s a standard account and the container only allows private access (the default).

  24. Hi Ken – I cracked open the source code – I think there is an error in ConfigHelper.ReadConfig() line 133: TargetAzureAccountKey = GetConfigValue("TargetAWSAccessKeyID", string.Empty);
    It should be:
    TargetAzureAccountKey = GetConfigValue("TargetAzureAccountKey", string.Empty);
    It works fine then – thanks.
    I guess this will only be a problem if you set the target key in the config like I have.

  25. Hi Ken, I’m evaluating the AzureCopy tool and have a couple of questions. If we set this up to perform a regular copy process, does AzureCopy copy the entire S3 contents, or can it be configured to copy only specific files or only recently added files?

  26. Hi Ken,
    I’m getting this error message. Any help would be appreciated. I’m using version 1.5.1.0. I’ve tested my AWS access keys with another application and they work OK.

    D:\Downloads\azurecopy>azurecopy.exe -list https://xxxx.s3.amazonaws.com
    Unknown error generated. Please report to Github page https://github.com/kpfaulkner/azurecopy/issues . Can view underlying stacktrace by adding -db flag. System.Collections.Generic.KeyNotFoundException: The given key was not present in the dictionary.
    at System.ThrowHelper.ThrowKeyNotFoundException()
    at System.Collections.Generic.Dictionary`2.get_Item(TKey key)
    at azurecopy.Utils.S3Helper.GenerateS3Client(String accessKey, String secretKey, String bucketName)
    at azurecopy.S3Handler.d__25.MoveNext()
    at azurecopycommand.Program.DoList(Boolean debugMode)
    at azurecopycommand.Program.Main(String[] args)
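
    (Editor's note: the KeyNotFoundException in GenerateS3Client above is what .NET throws on an unguarded Dictionary lookup, most likely a region-to-endpoint mapping that doesn't recognise the bucket's region. A hypothetical Python sketch of the failure mode and the usual guard – the region table here is invented for illustration, not taken from the azurecopy source:)

    ```python
    # Unguarded [] lookup raises on a missing key (the .NET analogue is
    # KeyNotFoundException); a try/except or .get() supplies a fallback.
    region_endpoints = {
        "us-east-1": "s3.amazonaws.com",
        "eu-west-1": "s3-eu-west-1.amazonaws.com",
    }

    def endpoint_for(region):
        try:
            return region_endpoints[region]       # raises KeyError if absent
        except KeyError:
            return region_endpoints["us-east-1"]  # fall back to the default
    ```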
