Load (or autoload) PowerShell module from central share

  • Question

  • hi there,

    In big companies, some people develop PowerShell modules and others should be able to use those modules. How can I establish a kind of central storage location for PowerShell modules on the network?

    Yours

    FG Clodt


    fgc

    Monday, June 24, 2013 1:13 PM

Answers

  • Try adding the path to $env:PSModulePath in the PowerShell profile.

    Mike

    • Proposed as answer by Malte Brodersen Monday, June 24, 2013 1:52 PM
    • Marked as answer by fgclodt Tuesday, June 25, 2013 7:12 AM
    Monday, June 24, 2013 1:43 PM
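
    [Editor's illustration] In practice, that suggestion can look like the following per-user profile snippet. The UNC path is a placeholder, and the `;` separator assumes Windows:

        # Append a central share to the module search path from a profile.
        # \\fileserver\PSModules is a hypothetical UNC path.
        $sharePath = '\\fileserver\PSModules'
        if (($env:PSModulePath -split ';') -notcontains $sharePath) {
            $env:PSModulePath += ";$sharePath"
        }

    The containment check keeps the path from being appended repeatedly when the profile runs in nested sessions.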
  • Try adding the path to $env:PSModulePath in the PowerShell profile.

    Mike

    That is part of the answer, but the question then becomes: what is the best way to make that change on all workstations and for all users? And also: what is the best way to distribute and maintain one of the AllUsers profiles so it takes effect in all cases?

    I guess you could maintain a central AllUsersCurrentHost profile, perhaps in the same location as the shared module repository. Of course, doing it that way will not make the modules available to those who are working offline. If you have people working offline, you might need to work out a strategy for updating locally stored copies of the modules and the AllUsersCurrentHost profile.
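
    [Editor's illustration] A local-copy strategy for offline users could be sketched like this; the source share and the local target folder are placeholders, and the local folder is assumed to be on the module path already:

        # Mirror the shared repository to a dedicated local folder (e.g. at
        # logon) so offline users still have the modules. /MIR makes the
        # target an exact copy, so use a folder reserved for this purpose.
        $source = '\\fileserver\PSModules'
        $target = 'C:\SharedModules'
        robocopy $source $target /MIR /R:1 /W:1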

    The other part of the question was the "or autoload" bit. This can be a bit tricky. It is easy to load all modules with a command in the profile like:

        Get-Module -ListAvailable | Import-Module

    There are a few issues though. If you have a large number of modules and they are loaded over your network, some users may find the waiting time frustrating when they type a command or run a script that doesn't actually use any of the modules. This might be addressed by maintaining the modules locally, and/or finding ways to import only those modules actually needed.
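
    [Editor's note] One way (illustrative; the module names are hypothetical) to import only what a session needs, rather than everything `-ListAvailable` finds, is to filter against a list. Note also that PowerShell 3.0 and later auto-import a module from any folder on `PSModulePath` the first time one of its commands is used, which often makes eager importing unnecessary:

        # Import only the modules this session actually needs.
        $needed = 'CompanyAD', 'CompanyReporting'   # hypothetical names
        Get-Module -ListAvailable |
            Where-Object { $needed -contains $_.Name } |
            Import-Module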

    Another issue has to do with programming and naming standards, and the principle of least surprise. If two modules each include a function of the same name, this will cause trouble. There are other ways for one module's functions to cause problems for other modules, so those maintaining the module repository and those supplying modules will need to develop some guidelines to avoid such problems. There will also be a need for reasonable testing before adding or updating any module, as such changes could break existing code. This highlights the need for version control and the ability to roll back to a known working state.
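
    [Editor's illustration] When two modules do export a function with the same name, `Import-Module -Prefix` can disambiguate them; the module names here are hypothetical:

        # Suppose both modules export Get-Report. The prefix is inserted
        # after the verb, so the two commands can coexist.
        Import-Module TeamAModule -Prefix TeamA   # exposes Get-TeamAReport
        Import-Module TeamBModule -Prefix TeamB   # exposes Get-TeamBReport

        # Module-qualified calls also work without any prefix:
        TeamAModule\Get-Report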


    Al Dunbar -- remember to 'mark or propose as answer' or 'vote as helpful' as appropriate.

    • Marked as answer by Yan Li_ Wednesday, June 26, 2013 5:17 AM
    Monday, June 24, 2013 3:58 PM
  • I tend to copy scripts between servers in a standard location and use scripts over modules most of the time. However, if I were to implement a solution for a shared enterprise module repo, I'd look at Tome Tanasovski's solution: http://powertoe.wordpress.com/2010/08/10/corporate-powershell-module-repository-part-1-design-and-infrastructure/
    • Marked as answer by fgclodt Tuesday, June 25, 2013 7:13 AM
    Monday, June 24, 2013 5:11 PM

All replies

  • The user profile could be deployed via Group Policy (really, if you wanted the store modules locally, they could be deployed with Group Policy as well).

    Mike

    Monday, June 24, 2013 4:05 PM
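
    [Editor's illustration] For a Group Policy startup script, the change could be made at machine scope so it applies to all users; the share path is a placeholder:

        # Append the share to the machine-level PSModulePath (requires
        # elevation; suited to a computer startup script).
        $sharePath = '\\fileserver\PSModules'
        $current = [Environment]::GetEnvironmentVariable('PSModulePath', 'Machine')
        if (($current -split ';') -notcontains $sharePath) {
            [Environment]::SetEnvironmentVariable('PSModulePath', "$current;$sharePath", 'Machine')
        }

    Group Policy Preferences can set the same environment variable declaratively, without a script.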
  • I tend to copy scripts between servers in a standard location and use scripts over modules most of the time. However, if I were to implement a solution for a shared enterprise module repo, I'd look at Tome Tanasovski's solution: http://powertoe.wordpress.com/2010/08/10/corporate-powershell-module-repository-part-1-design-and-infrastructure/

    That link addresses the OP's concerns fairly directly, including some that he has not specifically stated.

    Re: scripts versus modules, I started using script modules some time ago and, although I write the occasional script, I find that the use of modules brings a more "modular" structure to my use of PowerShell.

    That said, modules are best for functions that are likely going to be used interactively, or called from a variety of scripts or other module functions. The reason is that the command to import a module or call one of its functions does not need to specify the path, but can operate on a name-only basis.


    Al Dunbar -- remember to 'mark or propose as answer' or 'vote as helpful' as appropriate.

    Monday, June 24, 2013 7:29 PM
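
    [Editor's illustration] The name-versus-path distinction above looks like this in practice; the module and script names are hypothetical:

        # A module on $env:PSModulePath is found by name alone:
        Import-Module CompanyTools

        # A stand-alone script must be invoked (or dot-sourced) by path:
        . '\\fileserver\Scripts\Helpers.ps1'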
  • Hi,

    Just checking in to see if the suggestions were helpful. Please let us know if you would like further assistance.


    Cataleya Li
    TechNet Community Support

    Tuesday, June 25, 2013 6:40 AM
  • Hi there,

    Thanks to all of you for your helpful answers!

    Yours

     FG Clodt


    fgc

    Tuesday, June 25, 2013 7:17 AM