It appears that some of the script isn't escaping properly in the comment. I can assure you that the script is correct: it uses `$_` and the backtick (`` ` ``) to escape characters. GitHub's code-block rendering seems to have mangled it.
@siminsavani-msft Do you think this error is related to the issue fixed in PR #2635? Please take a look.
Hi, yes! It is related to the PR you tagged, @gapra-msft. It appears that @bfox-sugarshot is attempting to transfer a 16 TB disk, which causes this panic.
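For anyone following along, the panic class here is Go's nil-pointer dereference: dereferencing a pointer-typed optional field that was never set. The snippet below is only a minimal illustration of that general failure mode — the type and function are hypothetical, not AzCopy's actual code:

```go
package main

import "fmt"

// blobProperties stands in for an SDK response type where optional
// fields are pointers and may legitimately be nil.
// (Hypothetical type, not AzCopy's real one.)
type blobProperties struct {
	ContentLength *int64 // nil when the header is absent
}

// sizeOf dereferences the field without a nil check — exactly the
// class of bug that produces "invalid memory address or nil pointer
// dereference" followed by SIGSEGV in a Go stack trace.
func sizeOf(p blobProperties) int64 {
	return *p.ContentLength
}

func main() {
	defer func() {
		if r := recover(); r != nil {
			fmt.Println("recovered:", r)
		}
	}()
	fmt.Println(sizeOf(blobProperties{})) // ContentLength is nil → panic
}
```

Without the `recover`, the runtime would print the same `panic: runtime error: invalid memory address or nil pointer dereference` banner seen in this issue.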
Which version of AzCopy was used? 10.23.0
Note: The version is visible when running AzCopy without any argument
Which platform are you using? (ex: Windows, Mac, Linux) MacOS
What command did you run? azcopy copy SAS1 SAS2
Note: Please remove the SAS to avoid exposing your credentials. If you cannot remember the exact command, please retrieve it from the beginning of the log file.
What problem was encountered?

```
INFO: Failed to create one or more destination container(s). Your transfers may still succeed if the container already exists.
INFO: Any empty folders will not be processed, because source and/or destination doesn't have full folder support

panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x2 addr=0x0 pc=0x10162b3a0]

goroutine 1 [running]:
github.com/Azure/azure-storage-azcopy/v10/cmd.(*blobTraverser).Traverse(0x1400021a230, 0x0?, 0x0?, {0x10219dae0, 0x0, 0x0})
	github.com/Azure/azure-storage-azcopy/v10/cmd/zc_traverser_blob.go:231 +0x570
github.com/Azure/azure-storage-azcopy/v10/cmd.(*CopyEnumerator).enumerate(0x140000fa140)
	github.com/Azure/azure-storage-azcopy/v10/cmd/zc_enumerator.go:766 +0x48
github.com/Azure/azure-storage-azcopy/v10/cmd.(*CookedCopyCmdArgs).processCopyJobPartOrders(0x1400010a900)
	github.com/Azure/azure-storage-azcopy/v10/cmd/copy.go:1610 +0xb5c
github.com/Azure/azure-storage-azcopy/v10/cmd.(*CookedCopyCmdArgs).process(0x14000252a20?)
	github.com/Azure/azure-storage-azcopy/v10/cmd/copy.go:1262 +0x74
github.com/Azure/azure-storage-azcopy/v10/cmd.init.2.func2(0x14000423b80?, {0x140004e80e0?, 0x2?, 0x1016416f6?})
	github.com/Azure/azure-storage-azcopy/v10/cmd/copy.go:2014 +0x1b0
github.com/spf13/cobra.(*Command).execute(0x14000423b80, {0x140004e80a0, 0x2, 0x2})
	github.com/spf13/cobra@v1.4.0/command.go:860 +0x550
github.com/spf13/cobra.(*Command).ExecuteC(0x102107e80)
	github.com/spf13/cobra@v1.4.0/command.go:974 +0x318
github.com/spf13/cobra.(*Command).Execute(...)
	github.com/spf13/cobra@v1.4.0/command.go:902
github.com/Azure/azure-storage-azcopy/v10/cmd.Execute({0x140000db4e0?, 0x1020f6290?}, {0x140000db560?, 0x10168951c?}, 0x74?, {0xb87f1db2, 0x9aa3, 0x1346, {0x76, 0x2, ...}})
	github.com/Azure/azure-storage-azcopy/v10/cmd/root.go:220 +0x104
main.main()
	github.com/Azure/azure-storage-azcopy/v10/main.go:84 +0x3a4
```
How can we reproduce the problem in the simplest way? Attempt to copy a 16 TB managed disk to a page blob using SAS.
The following is the script I am using:
```powershell
[CmdletBinding()]
param (
    [Parameter(Mandatory = $true)]
    $DiskList
)

$disks = Import-Csv -Path $DiskList

Connect-AzAccount
$subscriptionId = "SUBID"
Select-AzSubscription -SubscriptionId $subscriptionId

$sasExpiryDuration = "7200"
$storageAccountName = "STORAGENAME"
$storageContainerName = "CONTAINERNAME"
$storageAccountKey = 'STORAGE_KEY'

$disks | ForEach-Object -Parallel {
    $diskName = $_.Name
    $resourceGroupName = $_.ResourceGroup
    $destinationVHDFileName = "$diskName.vhd"
    Write-Host "DiskName: $diskName"

    $sas = Grant-AzDiskAccess -ResourceGroupName $resourceGroupName -DiskName $diskName -DurationInSecond $using:sasExpiryDuration -Access Read
    $destinationContext = New-AzStorageContext -StorageAccountName $using:storageAccountName -StorageAccountKey $using:storageAccountKey
    $containerSASURI = New-AzStorageContainerSASToken -Context $destinationContext -ExpiryTime (Get-Date).AddSeconds($using:sasExpiryDuration) -FullUri -Name $using:storageContainerName -Permission rw

    # Split the container SAS URL into the container URL and the SAS token,
    # then rebuild it with the destination VHD name inserted.
    $containerName, $sasTokenKey = $containerSASURI -split '\?'
    $containerSASURI = "$containerName/$destinationVHDFileName`?$sasTokenKey"

    azcopy copy $sas.AccessSAS $containerSASURI
}
```
The CSV contains two columns: Name and ResourceGroup.
This script worked on all other disks except the 16 TB HDD.
Have you found a mitigation/solution? No