
Replace required module "Azure.Storage" with "Az.Storage" ? #361

Open
devax opened this issue Jan 9, 2020 · 42 comments

@devax

devax commented Jan 9, 2020

Hi good d365fo.tools people! :)
The module requirement "Azure.Storage" is the one line that keeps giving me a headache in this otherwise awesome PowerShell module. From my understanding, Azure.Storage itself has been superseded by Az.Storage. I was hoping that in the required modules section the line
, @{ ModuleName = 'Azure.Storage'; ModuleVersion = '4.4.0' }
could be replaced with this line:
, @{ ModuleName = 'Az.Storage'; ModuleVersion = '1.8.0' }
ideally without breaking things ...
The problem we're facing is that Azure.Storage itself relies on AzureRm.Profile. So whenever the d365fo.tools module is loaded, AzureRm.Profile gets loaded as well. This then makes it impossible to load Az.Profile (the successor of AzureRm.Profile) in the same PowerShell session.
Since our own PowerShell tools have been migrated from the AzureRM module to the Az module, we're in constant conflict with the d365fo.tools module requirements.
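
For illustration, a minimal repro of the conflict looks something like this (assuming both module sets are installed; Az.Accounts is the current name of what used to be Az.Profile):

Import-Module d365fo.tools   # pulls in Azure.Storage, which in turn loads AzureRm.Profile
Import-Module Az.Accounts    # fails, because AzureRm.Profile is already loaded in the session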
I would appreciate it if you could have a look at this issue and consider whether the module requirement could either be swapped out or removed, so that people like us can choose to use the Az.Storage/Az.Profile modules instead of Azure.Storage/AzureRm.Profile.
Thanks a lot.

@Splaxi
Collaborator

Splaxi commented Jan 9, 2020

I hear you loud and clear.

Then I would like to sign you up to help me out with some testing.

We actually only use the module to make sure some assemblies are in place for us to consume when working with the Azure blob storage functionality.

@devax
Author

devax commented Jan 12, 2020

I am certainly willing to help.

Are you talking about manual testing? If so, are there any specific test cases that you want me to run through? Or any specific cmdlets I could manually test using the other module? Just let me know how to assist.

@Splaxi
Collaborator

Splaxi commented Jan 12, 2020

We just need more eyes to test all scenarios for the following cmdlets / functions:

Get-D365AzureStorageFile
Invoke-D365AzureStorageDownload
Invoke-D365AzureStorageUpload

I just created a new branch where I believe I have fixed it. I'll make it pass our unit tests. Then I hope you can fork it into your own GitHub repo, get it onto your own machine and do some testing.

If it works, I'll merge it and release it straight away.

I did already complete my personal testing and everything seems to work for me.

Please note that if you find the progress bar somewhat annoying, $ProgressPreference = "SilentlyContinue" is something you want to configure before running the different cmdlets.
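
For reference, a minimal smoke test of the three cmdlets could look like the following sketch; the account name, SAS token, container and file names are placeholders, and the parameter names are taken from the module's help as I recall it:

$ProgressPreference = "SilentlyContinue"

# List the files in the container
Get-D365AzureStorageFile -AccountId "mystorageaccount" -SAS $sas -Container "backups"

# Upload a local file, then download it again
Invoke-D365AzureStorageUpload -AccountId "mystorageaccount" -SAS $sas -Container "backups" -Filepath "C:\Temp\Package.zip"
Invoke-D365AzureStorageDownload -AccountId "mystorageaccount" -SAS $sas -Container "backups" -FileName "Package.zip" -Path "C:\Temp\Download"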

@Splaxi
Collaborator

Splaxi commented Jan 12, 2020

And I can see that we might need to do some testing of:
Invoke-D365LcsUpload

It seems it is also using some internal assemblies and stuff...

@Splaxi
Collaborator

Splaxi commented Jan 12, 2020

You should be able to use this PR:

#362

Which is based on my branch:
https://github.com/Splaxi/d365fo.tools/tree/refactor-use-az.storage

@devax
Author

devax commented Jan 13, 2020

Thank you for acting so quickly on this. It is highly appreciated.

I have updated all our dev, build and release machines to the version including your pull request and have manually tested the cmdlets you pointed out. I've also run our different build and release pipelines that use the module. Everything works, nothing is broken as far as I can tell. I think it is OK to merge the PR.

Nevertheless, I'd like to point out two minor findings from my tests, for completeness.

On some machines we had the complete AzureRM powershell packages installed (in addition to the Az ones). On these machines, the following command:
Import-Module d365fo.tools
would succeed but also output the warning:
WARNING: Both Az and AzureRM modules were detected on this machine. Az and AzureRM modules cannot be imported in the same session or used in the same script or runbook. If you are running PowerShell in an environment you control you can use the 'Uninstall-AzureRm' cmdlet to remove all AzureRm modules from your machine. If you are running in Azure Automation, take care that none of your runbooks import both Az and AzureRM modules. More information can be found here: https://aka.ms/azps-migration-guide
This is just a warning and does not mean that things break. After running Uninstall-AzureRm as suggested, the warning was gone.

During my tests I also noticed that the command Invoke-D365AzureStorageUpload does not set the "Content type" property of the uploaded file. If the file already exists in Azure storage, the existing Content type value stays as it is. But if a new file is created, the Content type property is filled with a value of "application/octet-stream", which might not be ideal in every case. I don't think this is a change introduced by the PR; I have not tested it, but I'm pretty confident the Azure.Storage module shows the very same behavior. Not sure if that is something we want to consider implementing in the cmdlet (e.g. "guessing" the correct type by the file's extension, offering a parameter, or even something more sophisticated like looking at the binary's actual content ...). I can open a new issue for that if you want me to.
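
If we were to "guess" the content type from the file's extension, a minimal sketch could use the MIME mapping that ships with the .NET Framework; the -ContentType parameter mentioned in the comment is hypothetical, not an existing cmdlet parameter:

Add-Type -AssemblyName System.Web

# Map the file extension to a MIME type, e.g. .zip -> "application/x-zip-compressed"
$contentType = [System.Web.MimeMapping]::GetMimeMapping("C:\Temp\Package.zip")
# The value could then be passed to a hypothetical -ContentType parameter of Invoke-D365AzureStorageUpload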

On a last note, the default "Upload Asset to LCS" task from an Azure DevOps release pipeline seems to depend on the AzureRM module. So if you are using custom (on-premises) build/release machines with the Azure DevOps agent installed, and you want that agent to run the default "Upload to LCS" task from a release pipeline, then you must have the AzureRM module installed on that machine (at least for now). This is possible and releases will succeed, but you will have to live with the warning message that is displayed when importing the d365fo.tools module on these machines.

Sorry for the wall of text, I just wanted to have things documented, also for myself, in case I have to look this up again.

@Splaxi
Collaborator

Splaxi commented Jan 13, 2020

I appreciate you taking the time to provide solid feedback and to help me out with the testing! 👍

I can totally understand that you want the ability to override the current implementation of the file type. Personally, I would like to provide a parameter for it and let the consumer (you guys) figure out what file type you are uploading. So please open a new issue for that and provide as much information as you can. An example, or a full-blown PR, would be awesome 😉

Did you have time to test the upload to LCS from our tools?

In regards to people mixing our module with the official one and it reporting a warning when imported: I consider that an advanced scenario, and people going that route should expect things to take some effort. So I vote that the current implementation is working as intended, until proven otherwise.

@devax
Author

devax commented Jan 13, 2020

I've tested something similar to the following; it completed without error, and afterwards the file was present in the project's asset library. Let me know if I should test something else.

Get-D365LcsApiToken -Password $pass -Username $user -ClientId $clientId -LcsApiUri https://lcsapi.lcs.dynamics.com | Set-D365LcsApiConfig
Invoke-D365LcsUpload -FilePath $zipToUpload -FileDescription $desc -ProjectId $projId -FileType 'Data Package'

I agree this pull request is a very good one and also vote for it being merged! ;-)

As for the file type issue, I will create an issue for it and try to create a PR, unless someone else wants to jump in and do that.

@devax
Author

devax commented Jan 13, 2020

I just came across the following lines in the Az.Storage module file:

# Minimum version of the PowerShell engine required by this module
PowerShellVersion = '5.1'
# Minimum version of Microsoft .NET Framework required by this module. This prerequisite is valid for the PowerShell Desktop edition only.
DotNetFrameworkVersion = '4.7.2'

This seems to be a higher requirement than what we currently have in our module:

PowerShellVersion = '5.0'

I'm not sure if we would have to raise our requirements to these newer versions as well, and whether that would lead to problems for some people.
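
For anyone wanting to check what a given box actually has, the following two lines report the PowerShell version and the .NET Framework release key; a Release value of 461808 or higher indicates .NET 4.7.2:

$PSVersionTable.PSVersion
(Get-ItemProperty 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full' -Name Release).Release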

@Splaxi
Collaborator

Splaxi commented Jan 14, 2020

For the 5.1 requirement, we should simply test what the latest VHD file from LCS contains, and what a fairly new LCS deployed dev box contains. If it is present, there is no issue with upping the required version 🤞

@devax
Author

devax commented Jan 14, 2020

OK, I can set up a fresh local DEV environment based on the VHD provided by MS today, as well as a new LCS DEV box instance on Azure, and check. Will report back later.

@devax
Author

devax commented Jan 14, 2020

Both types of VMs come with PowerShell version 5.1.

The VHD comes with .NET 4.7.0, the LCS Azure DevBox comes with .NET 4.6.0.
On both of these machines you cannot load the Az.Storage module without updating .NET to at least v4.7.2. You can install the module, but when trying to load it, you'll get an error message that clearly states that .NET 4.7.2 or higher is needed.

This might be a problem. :-(

@Splaxi
Collaborator

Splaxi commented Jan 14, 2020

Do all versions of Az.Storage require .NET 4.7.2? 🤔

@devax
Author

devax commented Jan 14, 2020

Yes, since version 0.2.2, released on 24.09.2018.

@Splaxi
Collaborator

Splaxi commented Jan 14, 2020

That is going to be an issue.

The whole idea is that we support the different development boxes / machines directly. The entry point for the tools has to work with what ordinary people are going to have running.

So for now, we need to hold our horses and see what options we've got...

@Splaxi
Collaborator

Splaxi commented Jan 14, 2020

Could you please try running Windows Update before trying to install Az.Storage?

I can't remember if 4.7.2 is installed automatically through Windows Update 🤞

@devax
Author

devax commented Jan 15, 2020

Windows Update on Windows Server 2016 does not update the .NET Framework automatically. It seems to be not so straightforward to have Windows Server 2016 install optional Windows updates.

We could offer a cmdlet to handle the .NET 4.7.2 installation for people. Installation of the module does not seem to be an issue; as long as it is not being loaded, users won't be bothered. It is only when you actually want to use some Az.Storage function that you will get the error about the .NET Framework version.

For these cases we could have users run an Install-D365dotNet472 (or similar) cmdlet that will make the error go away.
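
A rough sketch of what such a cmdlet could look like; the download URL is an assumption and should be verified against Microsoft's official .NET Framework 4.7.2 offline installer link before use:

function Install-D365dotNet472 {
    # Assumed fwlink to the .NET Framework 4.7.2 offline installer - verify before use
    $url = 'https://go.microsoft.com/fwlink/?LinkId=863265'
    $installer = Join-Path $env:TEMP 'NDP472-Installer.exe'
    Invoke-WebRequest -Uri $url -OutFile $installer
    # /q = quiet install, /norestart = suppress the automatic reboot
    Start-Process -FilePath $installer -ArgumentList '/q', '/norestart' -Wait
    Write-Host 'A reboot may be required to complete the .NET 4.7.2 installation.'
}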

@Splaxi
Collaborator

Splaxi commented Jan 21, 2020

So I didn't forget about you, but I have been wondering what options we have and what impact each will have.

Could you please share your pipeline design (high-level) and maybe even share some code that shows us where you mix our tools with your own solution? Are you mixing the different commands / functions from each module in one big script? Could you redesign your execution to run in individual PowerShell tasks, to get a new session which isn't polluted with AzureRM.Profile?

I have been playing a bit with disabling the auto-load feature. This puts the responsibility on me to load all dependency modules beforehand, before loading the d365fo module. But it might not help us.

Have you considered updating your own solution to PowerShell 6.0 (Core)? That would make sure that your solution isn't impacted by our requirements, but it would force you to split different steps into separate tasks.

@Splaxi
Collaborator

Splaxi commented Jan 21, 2020

Okay, I tested a little - and wanted to share it with you. It might not help you at all, but I believe it is worth a shot.

Try the following:

Import-Module Azure.Storage
Get-Module

See the list of modules? Keep this session open, and read on.

Try the following in another session

Import-Module Az.Accounts
Get-Module

See the list of modules?

Now back to the first session:

Remove-Module Azure.Storage
Remove-Module AzureRM.Profile
Import-Module Az.Accounts -Force

See the list of modules?

Now your session should be ready to import the remaining Az modules and work against them. If you are switching between the modules, I believe I have found something interesting...

@Splaxi
Collaborator

Splaxi commented Jan 21, 2020

Try the following:

Import-Module Azure.Storage
Import-Module Az.Storage

You get an error - right?

But what if you loaded all the Az.* modules FIRST?!

Import-Module Az.Storage
Import-Module Azure.Storage

It seems that only the Az modules validate, on loading, whether the old AzureRM stuff is loaded.

I believe you should try and see how that works in your context. Just for us to learn what options we have...

@devax
Author

devax commented Jan 23, 2020

Hey @Splaxi , I appreciate your effort finding a solution for our case.
I played around with your suggestions. It seems like I can do an Import-Module sequence in the following order, which will kind of work in our case:

Import-Module Az.Accounts
Import-Module Az.KeyVault

Import-Module d365fo.tools

Get-AzContext -Verbose
...

This discovery is already a big relief for me. Not ideal, but it could definitely be worse! Thank you very much for sharing your ideas!

To give you more context about our situation:
The struggle we have is not so much about automation. I understand that in those scenarios workarounds can be found using different strategies (e.g. multiple sessions).
Headaches arise primarily in interactive sessions.
On our local DEV boxes, we offer the d365fo.tools in conjunction with our own custom module to help developers efficiently manage their own environments. From the d365fo.tools they use things like Start/Stop-Environment, Set-AxAdmin, Get-EnvironmentSettings, Invoke-DBSync, while from our own toolbox they initiate more custom and extensive workflows, like updating local databases from predefined environments, managing their Azure storage accounts, accessing credentials, etc.

I think my current options are:

  1. Create PowerShell profiles for all users on all environments to ensure the Az modules are loaded first when new sessions are created. This seems to be the short-term workaround route to take (see the profile sketch after this list).
  2. Exclude d365fo.tools from what we offer on DEV machines and just integrate the tools we actually use from your repo into our own toolbox.
  3. Maintain my own fork of your awesome repository, including this PR.
  4. Somehow convince you that integrating this PR into the repo is the right direction for your tools to take. ;)
  5. Somehow convince you to just remove the line with the required module Azure.Storage from the psd1 file, as this also seems to solve our problems.
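
For option 1, a minimal all-users profile sketch could look like this; the module list reflects our setup and is an assumption, and the file goes to the path reported by $PROFILE.AllUsersAllHosts:

# Load the Az modules first, so importing d365fo.tools cannot pollute the session with AzureRm.Profile
Import-Module Az.Accounts
Import-Module Az.KeyVault
Import-Module Az.Storage
Import-Module d365fo.tools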

@Splaxi
Collaborator

Splaxi commented Jan 27, 2020

Thank you for sticking around, even when I'm doing all I can NOT to give you what you need 🤣

I'm in favor of suggestion 1, and you know why that is 😉

Answers to the remaining points:
2. I would hate losing you and your user base from the module's perspective, but I totally understand why that is a consideration. The source code is MIT, so you're more than welcome to simply take what you need and get on with your work.
3. In the beginning this will be a doable approach, until we refactor stuff in the different parts of the Azure Storage functions; then it will become too much work compared to the effort you have to put into things.
4. You already know that this will be hard, which is why we have this very long conversation running already. The reason behind me being hesitant is that I want to support as many people and users as possible, and not introduce too many things that they need to have in place before being able to use 100% of the tools.
5. The sole reason for me to have Azure.Storage in the psd1 file is to help users install all needed modules, without them having to care about the dirty details. Until now this has proven rather powerful, and I would argue it has helped me grow the user base beyond hardcore techies.

I have some more comments and ideas that I want to share with you, but I will post them in a separate answer, to keep things a bit focused.

@Splaxi
Collaborator

Splaxi commented Jan 27, 2020

I have been thinking a lot about splitting the module into core functionality plus sub-modules, just like AzureRM and the Az modules. This could solve most of the issues that we are discussing. This is a raw list of sub-module suggestions:

d365fo.tools (core)
d365fo.tools.storage / d365fo.tools.blob
d365fo.tools.lcs

The LCS module would allow people to load a not-so-bloated module to work against the LCS API, which over time will grow considerably. Either you use the tools as a developer or you use them as a DevOps engineer; neither needs anything from the other landscape. Unless you're a hybrid, but then you should be able to understand that you need to import multiple modules.

The blob / storage module would allow people (you) to choose whether they want to use the tools that we have created, which are based on an older (but supported) implementation for their environments, or whether they have some special needs, just like you presented to me.

I can see that several people are asking Microsoft whether or not it is supported to install .NET 4.7 or higher on the developer machines. I'm hoping that Microsoft in the near future adds support for a newer Visual Studio, which will force people to upgrade their .NET version as well, and which could help me kill off Azure.Storage and go for Az.Storage.

If I were to split the module into smaller / specialized modules, I would argue that we need to maintain the latest release for a minimum of 6-12 months, to enable people to plan for the switch. People that are only using the bacpac features and some related functions will not notice any changes, while people using either LCS or Storage will be affected. The latter are also the most capable users of the tools and should be able to plan for the changes, if announced in advance.

I'm looking at how Microsoft solved the installation process for multiple sub-modules: https://www.powershellgallery.com/packages/Az/3.3.0
They simply have an empty parent module that only specifies required / dependency modules, and it will load these for the user.

Let's say that d365fo.tools becomes the empty parent module. This will allow the non-technical users to keep installing and updating like they always have. Then we will have a d365fo.tools.core, a d365fo.tools.lcs and a d365fo.tools.storage. You will then have to update your guides and instruct your users to write Import-Module d365fo.tools.core instead of d365fo.tools.
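
A hedged sketch of what the manifest of such an empty parent module might contain; the sub-module names follow the split suggested above, and the versions are placeholders:

@{
    ModuleVersion   = '1.0.0'
    RequiredModules = @(
        @{ ModuleName = 'd365fo.tools.core';    ModuleVersion = '1.0.0' }
        @{ ModuleName = 'd365fo.tools.lcs';     ModuleVersion = '1.0.0' }
        @{ ModuleName = 'd365fo.tools.storage'; ModuleVersion = '1.0.0' }
    )
}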

Everything is still being processed in my mind and I'm second-guessing every single idea that I have about this. Let me know what you think.

@devax
Author

devax commented Jan 27, 2020

Warning: Rant against Microsoft ahead. TL;DR at the end of the post.

Microsoft is totally lagging behind with their own tools. VS 2015, .NET 4.6, AzureRM: for all of these there are official MS recommendations to upgrade to current versions. They also give us the SqlPS module with all its shortcomings on our DEV VMs, while the superior SqlServer module has been around for nearly 3 years now, a source of pain in its own right! As the D365 "FinOps" developer community we still feel excited that we are now part of the modern SW development world. But in fact, we are still second-class citizens, stuck on outdated tools because no one at MS thinks it is important enough to spend the time updating our environments. This forces everyone who is willing to move forward to lose energy and time solving puzzles that are supposed to be a thing of the past!

In this scenario, the community's d365fo.tools could make a statement by saying NO to Microsoft. MS, you want to continue disrespecting your own recommendations? Fine. We will not let ourselves be held back by that. Instead, we want to empower users to overcome the limitations set by MS!

In that sense, your d365fo.tools module should reference neither Azure.Storage nor Az.Storage, as the first supports MS in its holding-us-back strategy and the second prevents users from installing your fantastic module. As a consequence, all Azure.Storage related functionality should be removed from d365fo.tools. Instead, a new tool should be added to d365fo.tools, with a name like Move-D365ToTheFuture 😉. This will install all the stuff needed to unlock the advanced, cool features of the d365fo.tools. Funny enough, "all the stuff needed" is not even that much; it is .NET 4.7.2, and the installation is quick and painless. Once done, the user can then install the cool stuff from the second part of the module, like Install-Module d365fo.tools.advanced. d365fo.tools.advanced then has the reference to Az.Storage (and the other cool stuff) and its corresponding cmdlets. So a user does not care about moving files to/from Azure Storage? No problem at all! Just use d365fo.tools, it contains like 95% of the functionality provided. Oh, you are missing some cool Azure functionality? No problem, just run Move-D365ToTheFuture followed by Install-Module d365fo.tools.advanced and you're good to go!

From the type of module split that you suggest, I don't see a real-life benefit. I acknowledge that it would be a nice thing to do from an engineering standpoint. The main benefit would be that you can more easily split responsibilities across modules, as different teams can maintain different modules. But it will also introduce more general complexity. What would be gained for the user? In fact, everyone would just do Install-Module d365fo.tools like they did before, wouldn't they? Why would someone care about saving potentially 0.5 MB of disk space or 10 seconds of installation time by installing only parts of the module instead of everything? Even in my case the module split would not help much, as there are other ways to work around the problem. As long as you're doing this mostly on your own and cannot really share responsibilities, you're probably best off just keeping it all together.

Sorry for the wall of text again, I got carried away and might have overstepped the mark a bit. Just wanted to add a different perspective, but in fact, I can live with every situation.

I have implemented my suggestion 1. Seems to work out so far.

TL;DR: New suggestion: d365fo.tools should reference neither Azure.Storage nor Az.Storage. Instead, a complementary module d365fo.tools.advanced should be created that references Az.Storage (and other not officially supported stuff) and contains the dependent cmdlets, further empowering users to move forward with their environments and processes.

@Splaxi
Collaborator

Splaxi commented Jan 27, 2020

The plan was to move the dependencies into the sub-module, so that you could avoid the required loading of the Azure.Storage module 😉

So when you write d365fo.tools.advanced, I write d365fo.tools.storage.

But I hear you loud and clear, and I'm somewhat on your side.

This entire module was created because I couldn't believe there were no tools available.

The d365fo.tools.advanced idea is growing on me the more I think about it. Wrapping all requirements / prerequisites into an installer cmdlet is a neat way to handle it, and then the user simply imports d365fo.tools.advanced and is good to go.

... Give me a few days to think about it ...

@devax
Author

devax commented Jan 28, 2020

So when you write d365fo.tools.advanced, I write d365fo.tools.storage.

We're not saying the same thing. You're saying you want to move the Azure.Storage dependency into a sub-module. I'm saying Azure.Storage is a thing of the past and should not be a dependency of anything.

Sooner or later even MS will have to update their (our) environments. You can wait until that happens and then make the move when they do, if you like.

@Splaxi
Collaborator

Splaxi commented Jan 28, 2020

You are right about that, but I was just trying to find a compromise 🙊🙈🙉

Which cmdlets / areas would you say we should move into d365fo.tools.advanced? 🤔

@devax
Author

devax commented Jan 28, 2020

Looking at your PR #362, I see the affected cmdlets are:

  • Get-D365AzureStorageFile
  • Invoke-D365AzureStorageDownload
  • Invoke-D365AzureStorageUpload

So these would be moved to the advanced module, and the Azure.Storage dependency would be replaced with Az.Storage. The requirement of the advanced module would then be .NET 4.7.2; the requirement for d365fo.tools would stay unchanged.

Once MS moves to .NET 4.7.2+ on their environments, we can re-integrate the cmdlets from d365fo.tools.advanced into d365fo.tools and update the requirement of d365fo.tools to .NET 4.7.2+, which will then be in line with MS environments again. The advanced module will then not be needed anymore.

To be honest, this sounds like a lot of work for not much gain, as I don't see many people here struggling with the current situation. ;-) I struggled a lot, but you gave me the excellent hint about module loading order, which enables me to keep our own situation under control for now. I understand where you're coming from; keeping things aligned with MS stuff is not a bad strategy. Instead of wasting valuable working hours here, we would probably be better off adding some pressure on MS to upgrade their environments.

@Splaxi
Collaborator

Splaxi commented Feb 10, 2020

I'm super thankful that you took the time to provide me with feedback and pitch ideas against the module.

As you said yourself, you're the very first to run into this issue, or at least the first to share it with me 😉

I've closed the PR and will also close this issue. It can quickly be revived when we need to.

Once more - thank you for sticking around!

@FH-Inway
Member

2.5 years later :)
The 10.0.24 VHD has .NET version 4.7.2. The script from https://stackoverflow.com/a/3495491/2720554 listing the .NET versions gives me the following output:

.NET Framework                   Product Version        Release
--------------                   ------- -------        -------
v2.0.50727                               2.0.50727.4927
v3.0                                     3.0.30729.4926
Windows Communication Foundation         3.0.4506.4926
Windows Presentation Foundation          3.0.6920.4902
v3.5                                     3.5.30729.4926
Client                           4.7.2   4.7.03190      461814
Full                             4.7.2   4.7.03190      461814
Client                                   4.0.0.0

As I understand the previous discussion, this was the reason that the switch from Azure.Storage to Az.Storage could not be done. Now, this switch would be possible. Among other benefits, this would also resolve #640.

Can this issue be opened again?

Splaxi reopened this Jun 19, 2022
@Splaxi
Collaborator

Splaxi commented Jun 19, 2022

It is open now.

Let's plan what this would mean in terms of work needed, while supporting LCS deployed machines and VHD (downloaded) machines.

@FH-Inway
Member

To get us started, I resurrected your PR #362 from 2020 and merged it with the current development branch in a new draft PR #663.

I did some testing with that version on a vanilla 10.0.24 VHD and so far, the tests have been successful. I used https://gist.github.com/FH-Inway/9140b17415c20147fef6eaaa633ad96c for testing.
The output from a test run can be seen here: https://gist.github.com/FH-Inway/7cc97d3c725073577c2ac3e4afd3f860

Let me know what you think. The next step is to test it on an LCS deployed environment.

@FH-Inway
Member

Also wanted to mention that New-AzUpgradeModulePlan reports the same 3 cmdlets that were changed by the PR:

Order Location                                   UpgradeType PlanResult     Original                Replacement
----- --------                                   ----------- ----------     --------                -----------
1     get-d365azurestoragefile.ps1:101:27        Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
2     get-d365azurestoragefile.ps1:95:27         Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
3     invoke-d365azurestoragedownload.ps1:138:35 Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
4     invoke-d365azurestoragedownload.ps1:132:35 Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
5     invoke-d365azurestorageupload.ps1:127:35   Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
6     invoke-d365azurestorageupload.ps1:120:35   Cmdlet      ReadyToUpgrade new-AzureStorageContext New-AzStorageCon...
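
For anyone wanting to reproduce the plan above: it comes from the Az.Tools.Migration module, something along these lines (parameter names as I recall them from that module's docs; version and path are placeholders):

Install-Module -Name Az.Tools.Migration
New-AzUpgradeModulePlan -FromAzureRmVersion '6.13.1' -ToAzVersion 'latest' -DirectoryPath 'C:\Repos\d365fo.tools'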

@Splaxi
Collaborator

Splaxi commented Jun 27, 2022

Awesome work!

I'm going to look into it over the next few days and see if I can find any issues. Otherwise, our "backup" plan is to have people pull the latest known good version from the PowerShell Gallery while we fix any issues 🤞😉

@FH-Inway
Member

I did another test run on an LCS deployed environment (a vanilla environment I deployed yesterday with version 10.0.24 and default configuration). The test script showed the same results as on the vanilla 10.0.24 VHD. I changed the PR to "Ready for review".

@FH-Inway
Member

FH-Inway commented Jul 9, 2023

Looking a bit more closely at how the Azure.Storage dependency is used: from #663 and New-AzUpgradeModulePlan we know that it impacts Invoke-D365AzureStorageUpload, Invoke-D365AzureStorageDownload and Get-D365AzureStorageFile. Looking at these three cmdlets, the only command from Azure.Storage used is New-AzureStorageContext. And from the created context, only the ConnectionString property is used. So the module is actually using a very small part of the Azure.Storage functionality. I wonder if we could find another way to determine the storage connection string and get rid of this dependency altogether.

All 3 cmdlets use this logic (which should be moved to an internal function to avoid redundancy):

if ([string]::IsNullOrEmpty($SAS)) {
    Write-PSFMessage -Level Verbose -Message "Working against Azure Storage Account with AccessToken"
    $storageContext = New-AzureStorageContext -StorageAccountName $AccountId.ToLower() -StorageAccountKey $AccessToken
}
else {
    Write-PSFMessage -Level Verbose -Message "Working against Azure Storage Account with SAS"
    $conString = $("BlobEndpoint=https://{0}.blob.core.windows.net/;QueueEndpoint=https://{0}.queue.core.windows.net/;FileEndpoint=https://{0}.file.core.windows.net/;TableEndpoint=https://{0}.table.core.windows.net/;SharedAccessSignature={1}" -f $AccountId.ToLower(), $SAS)
    $storageContext = New-AzureStorageContext -ConnectionString $conString
}
$cloudStorageAccount = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse($storageContext.ConnectionString)
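
If we wanted to drop the dependency, the connection string could be built directly, since only the ConnectionString property of the context is consumed anyway. A hedged sketch, reusing the variables from the snippet above; the account-key variant follows the standard Azure storage connection string syntax:

if ([string]::IsNullOrEmpty($SAS)) {
    # Standard account key connection string - no New-AzureStorageContext needed
    $conString = "DefaultEndpointsProtocol=https;AccountName={0};AccountKey={1}" -f $AccountId.ToLower(), $AccessToken
}
else {
    # Same SAS-based connection string the current code already builds
    $conString = "BlobEndpoint=https://{0}.blob.core.windows.net/;SharedAccessSignature={1}" -f $AccountId.ToLower(), $SAS
}
$cloudStorageAccount = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse($conString)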

@Splaxi
Collaborator

Splaxi commented Jul 12, 2023

This should be easy to fix - as my understanding has grown since the first implementation...

@FH-Inway
Member

@Splaxi After the merge of #663, I ran my tests on a pristine local VHD environment based on the latest 10.0.32 VHD. The results were the same as in my earlier tests.

I also tried to test it on a codespace, but there we still have some work to do, see #664

Hope your tests are successful as well. From my side, we are good to go ahead with the new release.

@FH-Inway
Member

@Splaxi I just realized that the links to the test script and the log of a run of the script were secret gists. Sorry about that; they are now public.

After removing the AzureAd dependency (see #744), I was also able to run the tests in a GitHub codespace. The results were successful.
You can do the same on my test-az-storage branch. Just start a codespace on that branch and run the d365fo.tools.az.storage.test.codespace.ps1 script (you need to provide some values for your Azure storage blob container in the first 3 variables).

@FH-Inway
Member

@Splaxi Another idea for reducing the potential impact of the new version: you could publish a prerelease version to the Gallery. This way, the new version could be tested by explicitly installing the prerelease, while everyone else would still get the current 0.6.79 version.

@FH-Inway
Member

@Splaxi Thanks for the 0.7.0 release. I just did a first run of that version in the wild, on an Azure DevOps pipeline with a windows-latest agent. It worked great, though you do get the following warning regarding the Az and AzureRM modules, because the agent still has AzureRM installed 😄

WARNING: Both Az and AzureRM modules were detected on this machine. Az and AzureRM modules cannot be imported in the same session or used in the same script or runbook. If you are running PowerShell in an environment you control you can use the 'Uninstall-AzureRm' cmdlet to remove all AzureRm modules from your machine. If you are running in Azure Automation, take care that none of your runbooks import both Az and AzureRM modules. More information can be found here: https://aka.ms/azps-migration-guide
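
If the warning bothers you and the job does not need AzureRM, one option (suggested by the warning text itself) is to remove AzureRM on the agent before importing; a sketch for a pipeline PowerShell step, assuming the agent permits the uninstall:

# Remove the preinstalled AzureRM modules, then import d365fo.tools without the warning
Uninstall-AzureRm
Import-Module d365fo.tools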

@FH-Inway
Member

Looking back at the original issue, it turns out that the new version does not seem to resolve it yet. If I understand correctly, @devax wanted to import both the d365fo.tools module and the Az.Profile module in the same session.

First of all, Az.Profile is not installed out of the box on the Microsoft VHD for local environments (I still have to check cloud hosted ones). It can be installed, but requires -AllowClobber (seems some Az.* modules have the same commands?).

After that, either d365fo.tools or Az.Profile can be imported in the current session, but not both. Depending on which one gets imported first, different errors are shown. Not sure if one of those is the error that @devax originally encountered or if we are making progress by getting a different error 😄

@devax If you are still with us after all those years and still face this issue, let us know which error you were/are facing.

Error when Az.Profile is imported first, d365fo.tools second:
[screenshot of the error message]

Error when d365fo.tools is imported first, Az.Profile second:
[screenshot of the error message]

Note: In the second case, Az.Profile is shown as an imported module by Get-Module. Not sure if it can be used despite the error message.
