Question
Wednesday, March 13, 2019 1:55 AM
We are seeing the below error when deploying a package using Azure Batch Explorer.
It was working until we enabled the firewall on the Azure Batch storage account.
Any clue what's happening here?
AutoStorageKeysInvalid
The auto storage account keys are invalid, please sync auto storage keys. RequestId:1adc3e75-011d-4eb0-8209-239d30157e5e Time:2019-03-13T01:34:52.6974191Z
All replies (6)
Wednesday, March 13, 2019 6:28 AM
Hi,
Right-click on the profile and then click on logs.
Post the error here.
Also, let me know the type of auth you are using for the storage account (storage account credentials or a Key Vault managed storage account).
Wednesday, March 13, 2019 12:27 PM
Hi,
I am thinking of two possible fixes based on your error message and your firewall setting.
Probable fix 1:
The Batch account caches the storage account keys, so make sure both have the same keys. You can also manually sync them from the Batch account in the portal by following the steps below (a scripted alternative is sketched after these steps).
- Go to the Batch account in the Azure portal.
- Select "Storage account" in the left pane. It should list the storage account details (including keys) and has a "Sync keys" button at the top.
- In another tab, open the Azure portal and go to the storage account.
- Select "Access keys" in the left pane. It lists the two keys for that storage account.
- Compare those keys with the storage account keys shown in the Batch account. Both should be the same.
- If they are not the same, click "Sync keys" in the Batch account UI.
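If you prefer to script that sync step, here is a minimal sketch using the azure-mgmt-batch Python SDK; the subscription, resource group, and account names are placeholders, and it assumes credentials are available to DefaultAzureCredential:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.batch import BatchManagementClient

SUBSCRIPTION_ID = "<subscription-id>"   # placeholder
RESOURCE_GROUP = "<resource-group>"     # placeholder
BATCH_ACCOUNT = "<batch-account-name>"  # placeholder

client = BatchManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Ask the Batch service to re-read the auto-storage account keys; this is
# the programmatic equivalent of the portal's sync button.
client.batch_account.synchronize_auto_storage_keys(RESOURCE_GROUP, BATCH_ACCOUNT)
```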
Probable fix 2:
Since the error started occurring after enabling the firewall on the storage account, make sure your public IP is allowed in the firewall settings.
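The same rule can also be set programmatically. A minimal sketch with the azure-mgmt-storage Python SDK, using placeholder names and 203.0.113.10 as a stand-in for your public IP:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient
from azure.mgmt.storage.models import (
    IPRule,
    NetworkRuleSet,
    StorageAccountUpdateParameters,
)

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
RESOURCE_GROUP = "<resource-group>"    # placeholder
STORAGE_ACCOUNT = "<storage-account>"  # placeholder
MY_PUBLIC_IP = "203.0.113.10"          # stand-in for your public IP

client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Deny all traffic by default, but allow this one public IP through.
client.storage_accounts.update(
    RESOURCE_GROUP,
    STORAGE_ACCOUNT,
    StorageAccountUpdateParameters(
        network_rule_set=NetworkRuleSet(
            default_action="Deny",
            ip_rules=[IPRule(ip_address_or_range=MY_PUBLIC_IP)],
        )
    ),
)
```

Note that this update replaces the whole network rule set, so include any existing rules you want to keep.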
Wednesday, March 20, 2019 5:16 AM
The root cause of this issue has been found.
After enabling the firewall on the storage account, we allowed the public IP of the network our laptop is connected to. With that rule we are able to browse the files through Storage Explorer.
When we upload a package through Batch Explorer, it first sends an API call to the Batch service, which should check the storage account for that package version and reply.
Since only our public IP is added to the storage account firewall, that call fails: the Batch account frontend service IP is not allowed through the storage account's firewall.
Currently the Batch service frontend IPs are not constant, so we can't add those IPs to the storage account firewall.
One workaround is to place the storage account in a different region (let's say Region A) from the Batch account (let's say Region B). In that case we can get the list of IPs used by the Batch service in Region B and add them to the storage account firewall (a sketch of enumerating those ranges follows below). This will work for small application packages, but it will take more time for large packages.
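As a rough sketch of how those per-region ranges might be enumerated with the azure-mgmt-network Python SDK; the use of the service-tags API, and the assumption that the BatchNodeManagement tag covers the relevant Batch IPs, are mine and not from the original reply:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient

SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
REGION = "eastus"                      # hypothetical "Region B"

client = NetworkManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# List the published service tags for the region and print the Batch ranges.
result = client.service_tags.list(REGION)
for tag in result.values:
    if tag.name.startswith("BatchNodeManagement"):
        print(tag.name, tag.properties.address_prefixes)
```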
The other option is to use an Azure file share.
Friday, March 22, 2019 7:57 PM
Any update on this issue? If the answer helped you resolve the problem, remember to mark it as the answer so others in the community can easily find the solution.
Thursday, June 13, 2019 12:52 PM
Please provide an example of using an Azure file share to resolve this package upload issue.
Our corporate Azure account has Premium File storage (on solid-state drives, SSD) - can that be used as the "Azure file share" in this situation? Please provide more detail about that as well.
Monday, June 17, 2019 12:44 PM
Hi dlypka,
Please go through the hyperlink for "Azure file share" above. It has information about the commands used to mount that file share on Windows and Linux nodes.
You can mount it on the compute nodes using a start task, for example as sketched below.
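A minimal sketch of such a start task with the azure-batch Python SDK, for Windows nodes; the UNC path, account name, and storage key are placeholders to be copied from the file share's connection info in the portal:

```python
from azure.batch import models as batchmodels

# Placeholders: copy the real UNC path, account name, and key from the
# file share's connection information in the portal.
MOUNT_CMD = (
    'cmd /c "net use S: \\\\<account>.file.core.windows.net\\<share> '
    '/u:AZURE\\<account> <storage-account-key>"'
)

# Run at node start-up, elevated, and hold the node until the mount succeeds.
start_task = batchmodels.StartTask(
    command_line=MOUNT_CMD,
    wait_for_success=True,
    user_identity=batchmodels.UserIdentity(
        auto_user=batchmodels.AutoUserSpecification(
            scope=batchmodels.AutoUserScope.pool,
            elevation_level=batchmodels.ElevationLevel.admin,
        )
    ),
)
```

The start task is then attached to the pool definition, so every node mounts the share before it accepts tasks.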
In this case, packaging and uploading the application to the Azure file share has to be done by us. We can use Storage Explorer to upload new files to the file share; once uploaded, those files will be available on the compute nodes.
I also doubt that application versioning will work; that needs to be verified by deploying.
But this method will work well if we need to distribute the executable to all the nodes.
Later in your task you can use the path of the newly uploaded files. You can keep a "latest" path in the file share where the latest binary is placed (a sketch of such an upload follows below). Again, we need to check how it works with versioning.
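As a sketch of that upload flow using the azure-storage-file-share Python package; the share name, directory layout, and file name are hypothetical:

```python
from azure.core.exceptions import ResourceExistsError
from azure.storage.fileshare import ShareClient

CONN_STR = "<storage-connection-string>"  # placeholder
SHARE_NAME = "apppackages"                # hypothetical share name

share = ShareClient.from_connection_string(CONN_STR, SHARE_NAME)

# Create the versioned and "latest" directories one level at a time
# (parent directories must already exist on a file share).
for path in ("app", "app/1.0.1", "app/latest"):
    try:
        share.get_directory_client(path).create_directory()
    except ResourceExistsError:
        pass  # directory is already there

# Upload the binary to the versioned path and overwrite the "latest" copy.
with open("myapp.exe", "rb") as data:
    share.get_directory_client("app/1.0.1").upload_file("myapp.exe", data)
with open("myapp.exe", "rb") as data:
    share.get_directory_client("app/latest").upload_file("myapp.exe", data)
```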