Slow Artifact Download #3338

Closed
evjrr opened this issue Feb 14, 2024 · 11 comments

@evjrr

evjrr commented Feb 14, 2024

Artifact download is slow: it takes about 2 hours, and my connection is not the problem; I had 560 Mbps download and 940 Mbps upload when I ran the test.

Normally it takes about 7-9 minutes to build a new container from scratch.

But with this issue, all our pipelines fail whenever the artifact cache is not already present, because of a 20-minute timeout for downloading artifacts.

Script used to create the container and reproduce the issue:

Clear-Host
Write-Host "Start: $(([Datetime]::Now))"
$Artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
New-BcContainer -accept_eula -artifactUrl $Artifact -isolation process -auth UserPassword -Credential 'Dev' -containerName Test
Write-Host "End: $(([Datetime]::Now))"

Full output of the script:

Start: 02/14/2024 17:45:54
BcContainerHelper version 6.0.5
BC.HelperFunctions emits usage statistics telemetry to Microsoft
BcContainerHelper is version 6.0.5
BcContainerHelper is not running as administrator
UsePsSession is True
Host is Microsoft Windows 11 Enterprise - 10.0.22631.3007
Docker Client Version is 25.0.3
Docker Server Version is 25.0.3
Removing Desktop shortcuts
Downloading artifact /sandbox/23.4.15643.16218/w1
Downloading C:\Users\jrr\AppData\Local\Temp\2194d6ea-60d0-4366-a5f2-c9ea9a4a6d2f.zip
Downloading using HttpClient
Unpacking artifact to tmp folder using 7zip
Downloading platform artifact /sandbox/23.4.15643.16218/platform
Downloading C:\Users\jrr\AppData\Local\Temp\16c0cee4-ae86-4ccc-82f6-9ec00dd40748.zip
Downloading using HttpClient
Unpacking artifact to tmp folder using 7zip
Fetching all docker images
Fetching all docker volumes
Using image mcr.microsoft.com/businesscentral:10.0.20348.2227
Creating Container Test
Style: sandbox
Multitenant: Yes
Version: 23.4.15643.16218
Platform: 23.0.16180.0
Generic Tag: 1.0.2.14
Container OS Version: 10.0.20348.2227 (ltsc2022)
Host OS Version: 10.0.22631.3007 (Unknown/Insider build)
Using process isolation
Using locale en-US
Disabling the standard eventlog dump to container log every 2 seconds (use -dumpEventLog to enable)
Files in C:\ProgramData\BcContainerHelper\Extensions\Test\my:
- AdditionalOutput.ps1
- MainLoop.ps1
- SetupVariables.ps1
- updatecontainerhosts.ps1
Creating container Test from image mcr.microsoft.com/businesscentral:10.0.20348.2227
c485f195c6f522fb9619b493df169f12f618d81c7d02a30d9708f0cd5e9f6314
Waiting for container Test to be ready
Using artifactUrl https://bcartifacts.azureedge.net/sandbox/23.4.15643.16218/w1
Using installer from C:\Run\210-new
Installing Business Central
Installing from artifacts
Starting Local SQL Server
Starting Internet Information Server
Copying Service Tier Files
c:\dl\sandbox\23.4.15643.16218\platform\ServiceTier\Program Files
c:\dl\sandbox\23.4.15643.16218\platform\ServiceTier\System64Folder
Copying PowerShell Scripts
c:\dl\sandbox\23.4.15643.16218\platform\WindowsPowerShellScripts\Cloud\NAVAdministration
c:\dl\sandbox\23.4.15643.16218\platform\WindowsPowerShellScripts\WebSearch
Copying Web Client Files
c:\dl\sandbox\23.4.15643.16218\platform\WebClient\Microsoft Dynamics NAV
Copying ModernDev Files
c:\dl\sandbox\23.4.15643.16218\platform
c:\dl\sandbox\23.4.15643.16218\platform\ModernDev\program files\Microsoft Dynamics NAV
Copying additional files
Copying ConfigurationPackages
C:\dl\sandbox\23.4.15643.16218\platform\ConfigurationPackages
Copying Test Assemblies
C:\dl\sandbox\23.4.15643.16218\platform\Test Assemblies
Copying Extensions
C:\dl\sandbox\23.4.15643.16218\w1\Extensions
Copying Applications
C:\dl\sandbox\23.4.15643.16218\platform\Applications
Copying dependencies
Copying ReportBuilder
Importing PowerShell Modules
Restoring CRONUS Demo Database
Setting CompatibilityLevel for tenant on localhost\SQLEXPRESS
Exporting Application to CRONUS
Removing Application from tenant
Modifying Business Central Service Tier Config File for Docker
Creating Business Central Service Tier
Installing SIP crypto provider: 'C:\Windows\System32\NavSip.dll'
Starting Business Central Service Tier
Importing license file
Copying Database on localhost\SQLEXPRESS from tenant to default
Taking database tenant offline
Copying database files
Attaching files as new Database default
Putting database tenant back online
Mounting tenant database
Mounting Database for default on server localhost\SQLEXPRESS with AllowAppDatabaseWrite = False
Sync'ing Tenant
Tenant is Operational
Stopping Business Central Service Tier
Installation took 388 seconds
Installation complete
Initializing...
Setting host.containerhelper.internal to 172.24.160.1 in container hosts file
Starting Container
Hostname is Test
PublicDnsName is Test
Using NavUserPassword Authentication
Creating Self Signed Certificate
Self Signed Certificate Thumbprint 57A3B87D1A4B88515C4D0F4E62C2D16D5ED17CD7
DNS identity Test
Modifying Service Tier Config File with Instance Specific Settings
Starting Service Tier
Registering event sources
Creating DotNetCore Web Server Instance
Using application pool name: BC
Using default container name: NavWebApplicationContainer
Copy files to WWW root C:\inetpub\wwwroot\BC
Create the application pool BC
Create website: NavWebApplicationContainer without SSL
Update configuration: navsettings.json
Done Configuring Web Client
Enabling Financials User Experience
Dismounting Tenant
Mounting Tenant
Mounting Database for default on server localhost\SQLEXPRESS with AllowAppDatabaseWrite = False
Sync'ing Tenant
Tenant is Operational
Creating http download site
Setting SA Password and enabling SA
Creating Dev as SQL User and add to sysadmin
Creating SUPER user
Container IP Address: 172.24.164.188
Container Hostname  : Test
Container Dns Name  : Test
Web Client          : http://Test/BC/?tenant=default
Dev. Server         : http://Test
Dev. ServerInstance : BC
Dev. Server Tenant  : default
Setting Test-default to 127.0.0.1 in container hosts file

Files:
http://Test:8080/ALLanguage.vsix

Container Total Physical Memory is 31.7Gb
Container Free Physical Memory is 12.0Gb

Initialization took 49 seconds
Ready for connections!

Creating Container user winrm
Reading CustomSettings.config from Test
Creating Desktop Shortcuts for Test
Cleanup old dotnet core assemblies
Container Test successfully created

Use:
Get-BcContainerEventLog -containerName Test to retrieve a snapshot of the event log from the container
Get-BcContainerDebugInfo -containerName Test to get debug information about the container
Enter-BcContainer -containerName Test to open a PowerShell prompt inside the container
Remove-BcContainer -containerName Test to remove the container again
docker logs Test to retrieve information about URL's again
End: 02/14/2024 19:51:29

@freddydk
Contributor

So, you are saying that this:

$Artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
Download-Artifacts -artifactUrl $artifact -includePlatform

Takes 1-2 hours???

Or is it the container creation?

I can see that you are not running as administrator, yet UsePsSession is true - that is strange, but would not cause any issues with artifact download.

@evjrr
Author

evjrr commented Feb 15, 2024

It is the download part that takes 1-2 hours.
It's the same if I just download the artifact from a browser.
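To rule out BcContainerHelper itself, the raw download can be timed with plain PowerShell against the same artifact URL. A minimal sketch, using the w1 URL from the log above and an arbitrary temp path:

# Minimal sketch: time a raw download of the artifact zip without BcContainerHelper.
$ProgressPreference = 'SilentlyContinue'   # the Invoke-WebRequest progress bar slows large downloads
$url  = 'https://bcartifacts.azureedge.net/sandbox/23.4.15643.16218/w1'   # URL taken from the log above
$dest = Join-Path $env:TEMP 'w1-artifact.zip'                             # arbitrary destination path
$elapsed = Measure-Command { Invoke-WebRequest -Uri $url -OutFile $dest -UseBasicParsing }
$sizeMB  = (Get-Item $dest).Length / 1MB
Write-Host ("Downloaded {0:N0} MB in {1:N0} seconds ({2:N1} MB/s)" -f $sizeMB, $elapsed.TotalSeconds, ($sizeMB / $elapsed.TotalSeconds))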

@freddydk
Contributor

freddydk commented Feb 15, 2024

Artifacts are not fast, and we are investigating ways to make them faster.
Having said that - I can (from my home, with an internet connection similar to yours) get the BC artifact URL in 7 seconds and download the artifacts in 1 minute and 40 seconds.

Measure-Command {
    $Artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
}
Measure-Command {
    Download-Artifacts -artifactUrl $artifact -includePlatform
}



Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 7
Milliseconds      : 120
Ticks             : 71209458
TotalDays         : 8,24183541666667E-05
TotalHours        : 0,0019780405
TotalMinutes      : 0,11868243
TotalSeconds      : 7,1209458
TotalMilliseconds : 7120,9458

Downloading artifact /sandbox/23.4.15643.16235/w1
Downloading C:\Users\freddyk\AppData\Local\Temp\2a98828f-9533-4d24-9265-aa94f2ebc4cd.zip
Downloading using WebClient
Unpacking artifact to tmp folder using 7zip
Downloading platform artifact /sandbox/23.4.15643.16235/platform
Downloading C:\Users\freddyk\AppData\Local\Temp\0ecd86a8-a4b9-41a2-8187-44ccf510729b.zip
Downloading using WebClient
Unpacking artifact to tmp folder using 7zip
Days              : 0
Hours             : 0
Minutes           : 1
Seconds           : 39
Milliseconds      : 530
Ticks             : 995307043
TotalDays         : 0,00115197574421296
TotalHours        : 0,0276474178611111
TotalMinutes      : 1,65884507166667
TotalSeconds      : 99,5307043
TotalMilliseconds : 99530,7043

The problem seems to be local at your place - maybe you have a proxy or a virus scanner causing this. It isn't something I can reproduce, nor something I have seen from other people.

@evjrr
Author

evjrr commented Feb 15, 2024

It is running fine now.

But the problem was not limited to my work computer or the office network: I tested at work and from home, and at home I tested on both my private PC and my work PC with the same result. So it is not a proxy or virus scanner problem; I think it is an Azure Blob Storage problem.

It is not the first time I have seen this problem.

@joandrsn

I work at the same company as @evjrr and have tried the same commands as you just posted, @freddydk.

The results are quite a bit slower (~25 minutes to download artifacts). That is bad news for us since we have a timeout of 20 minutes to download an artifact.

In order to test on some other ISP, I had a couple of friends test the same URL (https://bcartifacts.azureedge.net/sandbox/23.4.15643.16247/w1). They reported as follows:

  • ISP: TDC, download time: 9 minutes, Location: Aarhus, Denmark
  • ISP: Kazoom, download time: 14 minutes, Location: Skanderborg, Denmark

Is it possible that the Azure Storage Account bcartifacts could be reaching some throughput limit at peak times?
We do not have experience with large public Azure Storage Accounts, but I would personally start with the following article and see whether the metrics in Azure Storage report any issues:
https://learn.microsoft.com/en-us/troubleshoot/azure/azure-storage/troubleshoot-storage-performance
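For whoever at Microsoft owns that storage account, a rough sketch of pulling those metrics with the Az.Monitor PowerShell module might look like this (the resource ID is a placeholder; we obviously cannot run this against bcartifacts ourselves):

# Rough sketch, only runnable by someone with access to the storage account behind bcartifacts.
# The resource ID below is a placeholder; 'Egress' is one of the standard blob service metrics.
$resourceId = '/subscriptions/<sub>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/bcartifacts/blobServices/default'
Get-AzMetric -ResourceId $resourceId -MetricName 'Egress' `
    -StartTime (Get-Date).AddDays(-1) -EndTime (Get-Date) `
    -TimeGrain ([TimeSpan]::FromHours(1)) -AggregationType Total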

The simple solution for us would be to increase the timeout to 30 minutes or more, but with more pipelines and developers coming to Business Central, I expect the problem to keep growing.
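As an interim workaround, the download step could be wrapped in a background job with a longer, configurable limit instead of the hard 20-minute pipeline timeout. A sketch (the 45-minute value is just an example, not our actual pipeline setting):

# Sketch of an interim workaround: give the artifact download its own, longer timeout.
$timeoutSeconds = 45 * 60    # example value only
$job = Start-Job -ScriptBlock {
    Import-Module BcContainerHelper
    $artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
    Download-Artifacts -artifactUrl $artifact -includePlatform
}
if (Wait-Job -Job $job -Timeout $timeoutSeconds) {
    Receive-Job -Job $job
} else {
    Stop-Job -Job $job
    throw "Artifact download did not finish within $timeoutSeconds seconds"
}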

PS C:\Users\jsa> Measure-Command {
>>     $Artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
>> }
BcContainerHelper version 6.0.3
BC.HelperFunctions emits usage statistics telemetry to Microsoft


Days              : 0
Hours             : 0
Minutes           : 0
Seconds           : 18
Milliseconds      : 353
Ticks             : 183530401
TotalDays         : 0,000212419445601852
TotalHours        : 0,00509806669444444
TotalMinutes      : 0,305884001666667
TotalSeconds      : 18,3530401
TotalMilliseconds : 18353,0401



PS C:\Users\jsa> Measure-Command {
>>     Download-Artifacts -artifactUrl $artifact -includePlatform
>> }
Downloading artifact /sandbox/23.4.15643.16247/w1
Downloading C:\Users\jsa\AppData\Local\Temp\e3e8a53e-55fb-4d1a-943f-1171576de634.zip
Downloading using WebClient
Unpacking artifact to tmp folder using 7zip
Downloading platform artifact /sandbox/23.4.15643.16247/platform
Downloading C:\Users\jsa\AppData\Local\Temp\9c2f87a8-657a-48a3-b481-3c4d906a0a3c.zip
Downloading using WebClient
Unpacking platform artifact to tmp folder using 7zip
Downloading Prerequisite Components
Downloading c:\bcartifacts.cache\sandbox\23.4.15643.16247\platform\Prerequisite Components\IIS URL Rewrite Module\rewrite_2.0_rtw_x64.msi
Downloading using WebClient
Downloading c:\bcartifacts.cache\sandbox\23.4.15643.16247\platform\Prerequisite Components\DotNetCore\DotNetCore.1.0.4_1.1.1-WindowsHosting.exe
Downloading using WebClient


Days              : 0
Hours             : 0
Minutes           : 24
Seconds           : 43
Milliseconds      : 778
Ticks             : 14837780411
TotalDays         : 0,0171733569571759
TotalHours        : 0,412160566972222
TotalMinutes      : 24,7296340183333
TotalSeconds      : 1483,7780411
TotalMilliseconds : 1483778,0411

@kasperdj

I have tested the direct download of the weekly sandbox artifact for DK (https://bcartifacts.azureedge.net/sandbox/23.4.15643.16092/dk) from home on a 1000 Mbit connection - I get an extremely poor download speed of 16-24 kB per second, which is back to analog modem "speed". Download ETA is 2+ hours! I know this is not a ContainerHelper issue but something in Entra - please engage the relevant people at Microsoft, as all our pipelines are currently timing out and therefore failing.

@AskeHolst

Same here. I have GitHub Actions runs taking forever, so I tried a local download just to check:

[screenshot of the download stalling in the browser]

Part of it downloads and then it grinds to a halt; Chrome eventually gave up. Gigabit fiber, so it should not be local.

@kasperdj

I think somebody fixed the issue during the night: 7 pipelines executed this morning and everything is now back to normal speed :-)

@RadoArvay

A similar thing (but not as extreme) was discussed here: #3320

@freddydk freddydk closed this as completed Mar 6, 2024
@ronnykwon

I don't feel it's fixed; it comes and goes... I'm still facing timeout issues using the latest BcContainerHelper. I don't face any on GitHub-hosted runners, only on my self-hosted runners with a 300 Mbps internet connection.

@freddydk
Contributor

@ronnykwon please create a new issue with the version of ContainerHelper used, the artifact URL, and the time spent before timing out.
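Something like the following should capture those details in one go (a sketch using standard cmdlets plus the functions already shown in this thread):

# Sketch: collect the ContainerHelper version, artifact URL, and download time for a new issue.
Get-Module BcContainerHelper -ListAvailable | Select-Object Name, Version
$artifact = Get-BCArtifactUrl -version 23 -type Sandbox -country w1 -select Latest
Write-Host "Artifact URL: $artifact"
$elapsed = Measure-Command { Download-Artifacts -artifactUrl $artifact -includePlatform }
Write-Host "Download took $([int]$elapsed.TotalSeconds) seconds"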
