Question
Monday, November 19, 2018 5:54 PM
On an upload of 900,000+ files to a blob container, I sometimes get this error. If I rerun it, it may work fine the next time.
When I upload or download fewer than 1,000 files, there has never been an error.
Using azure-cli.x86_64 2.0.50-1.el7 @azure-cli on CentOS.
70769 Done, 0 Failed, 689231 Pending, 0 Skipped, 760000 Total (scanning...), 2-sec Throughput (Mb/s): 1168.5787
71109 Done, 0 Failed, 708891 Pending, 0 Skipped, 780000 Total (scanning...), 2-sec Throughput (Mb/s): 267.9121
71148 Done, 0 Failed, 708852 Pending, 0 Skipped, 780000 Total (scanning...), 2-sec Throughput (Mb/s): 2418.4927
71150 Done, 0 Failed, 708850 Pending, 0 Skipped, 780000 Total (scanning...), 2-sec Throughput (Mb/s): 1224.8927
71150 Done, 0 Failed, 708850 Pending, 0 Skipped, 780000 Total (scanning...), 2-sec Throughput (Mb/s): 1540.7855
71412 Done, 0 Failed, 718588 Pending, 0 Skipped, 790000 Total (scanning...), 2-sec Throughput (Mb/s): 1568.3172
Accessing /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2612_vm failed with error lstat /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2612_vm: no such file or directory
71412 Done, 0 Failed, 718588 Pending, 0 Skipped, 790000 Total (scanning...), 2-sec Throughput (Mb/s): 1568.3172
Accessing /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2613 failed with error lstat /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2613: no such file or directory
71412 Done, 0 Failed, 718588 Pending, 0 Skipped, 790000 Total (scanning...), 2-sec Throughput (Mb/s): 1568.3172
Accessing /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2613_vm failed with error lstat /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2613_vm: no such file or directory
71412 Done, 0 Failed, 718588 Pending, 0 Skipped, 790000 Total (scanning...), 2-sec Throughput (Mb/s): 1568.3172
Accessing /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2615 failed with error lstat /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2615: no such file or directory
71412 Done, 0 Failed, 718588 Pending, 0 Skipped, 790000 Total (scanning...), 2-sec Throughput (Mb/s): 1568.3172
Accessing /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2615_fsm failed with error lstat /mnt/data/client-db-primary-dev/base/20181118T230509/data/base/6534168/2615_fsm: no such file or directory
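For context, this kind of lstat failure is typically what you see when a path that was present during the scan pass is removed before the file is actually read, for example if the source tree is still being modified while the upload runs. A minimal sketch that reproduces the same class of error outside AzCopy (the /tmp/lstat-demo path and file names are just placeholders for illustration):

#!/bin/bash
# Build a file list, then delete a file before it is stat'ed,
# mimicking a source tree that changes during a long-running scan.
SRC=/tmp/lstat-demo
mkdir -p "$SRC"
touch "$SRC/2612_vm"

find "$SRC" -type f > /tmp/filelist.txt   # enumeration pass (like the scan above)
rm "$SRC/2612_vm"                         # file removed by another process

while read -r f; do
    # stat fails with "No such file or directory", the same condition
    # reported above as "lstat ... no such file or directory".
    stat "$f" >/dev/null || echo "Accessing $f failed"
done < /tmp/filelist.txt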
All replies (4)
Wednesday, November 21, 2018 9:47 AM
Hi Robert,
Can you share the document link that you are following?
Thanks,
Wednesday, November 21, 2018 2:48 PM
No, it belongs to a client and I do not have permission to publicly display it.
Friday, November 23, 2018 12:03 PM
Sure, I understand. What is the total size of all the files combined? Also, can you paste the azcopy command with the parameters that you are using?
Here is a link that lists Azure Files scalability and performance targets. This doc indicates the boundaries of our testing and which targets are actually hard limits.
Thursday, November 29, 2018 2:31 PM
Sorry for not responding. I had to drop this for a while but will be working on it again next week.
azcopy copy /mnt/data/${BARMAN_SERVER}/base/ \
https://primary.blob.core.windows.net/${container_base_name} \
--recursive=true \
--log-level=error \
--overwrite=false
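Since the failures are intermittent and a rerun usually succeeds, one possible workaround (just a sketch, not something confirmed in this thread) is to wrap the same copy in a retry loop. Because --overwrite=false is already set, a rerun skips blobs that completed on an earlier attempt rather than re-uploading them. MAX_RETRIES is an arbitrary value; BARMAN_SERVER and container_base_name come from the environment above:

#!/bin/bash
# Retry the same AzCopy invocation a few times; --overwrite=false means
# blobs finished on an earlier attempt are skipped, not re-sent.
MAX_RETRIES=3
for attempt in $(seq 1 "$MAX_RETRIES"); do
    if azcopy copy "/mnt/data/${BARMAN_SERVER}/base/" \
        "https://primary.blob.core.windows.net/${container_base_name}" \
        --recursive=true \
        --log-level=error \
        --overwrite=false; then
        echo "Upload succeeded on attempt $attempt"
        break
    fi
    echo "Attempt $attempt failed, retrying..." >&2
done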
Total backup size is 187 GB:
-bash-4.2$ du -h -d1
1018M ./log
4.0K ./pg_tblspc
4.0K ./pg_twophase
12K ./pg_notify
1.8M ./pg_stat_tmp
288K ./pg_subtrans
180G ./base
180K ./pg_logical
4.0K ./pg_snapshots
5.4G ./pg_wal
185M ./global
6.2M ./pg_multixact
4.0K ./pg_serial
8.0M ./pg_xact
64K ./pg_stat
4.0K ./pg_dynshmem
36K ./pg_replslot
4.0K ./pg_commit_ts
187G .
Total file and directory counts:
-bash-4.2$ find . -type f | wc -l
930045
-bash-4.2$ find . -type d | wc -l
1524
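One way to check whether the source tree is actually changing while the upload runs (which would explain the intermittent lstat errors) is to take two listings a few minutes apart and compare them. A small sketch along those lines; /tmp/list1.txt and /tmp/list2.txt are arbitrary scratch files and the 300-second wait is just an example interval:

#!/bin/bash
# Snapshot the azcopy source twice; any path that appears only in the
# first listing was removed in between.
SRC_DIR="/mnt/data/${BARMAN_SERVER}/base"
find "$SRC_DIR" -type f | sort > /tmp/list1.txt
sleep 300
find "$SRC_DIR" -type f | sort > /tmp/list2.txt
comm -23 /tmp/list1.txt /tmp/list2.txt   # files present before, gone after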