Inconsistency experienced in downloading a web page using PowerShell

  • Question

  • Dear All,

    I have been experiencing some inconsistency with the way a web page gets downloaded using PowerShell.

    This is the script:

    $url2 = ("webpageurl")
    $credentials = get-credential
    $webobj = New-Object system.net.webclient
    $webobj.credentials = $credentials
    $content=$webobj.downloadstring($url2)
    $content

    Here, webpageurl is an internal web page that takes different credentials from the ones I use to log in to my PC.

    ==============================

    When running the above script, sometimes the web page gets downloaded, and sometimes it just fails with the following error:

    Exception calling "DownloadString" with "1" argument(s): "The remote server returned an error: (401) Unauthorized."
    At C:\Users\schandras_adm\Documents\windowspowershell\get-ndswebpage.ps1:6 char:32
    + $content=$webobj.downloadstring <<<< ($url2)
        + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
        + FullyQualifiedErrorId : DotNetMethodException

    For example, if I run this, say, six times, it may work the seventh time and fail again the eighth, and so on. Hope you are getting my point.

    I am just confused as to why it sometimes fails and sometimes works, without any change in the script or the credentials.

    Please help!

    Wednesday, November 18, 2015 1:54 PM

Answers

  • $url2 = 'weburl'
    $password = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $user = "domain\username"
    $credentials = New-Object System.Management.Automation.PSCredential($user, $password)
    $webobj = New-Object System.Net.WebClient
    $webobj.Credentials = $credentials
    do {
        $erroroccurred = $false
        try {
            $content = $webobj.DownloadString($url2)
        }
        catch {
            $erroroccurred = $true
        }
    } while ($erroroccurred)
    $content

    Thanks, Mike, for your concern and suggestion. Thanks, jrv. When I said it would not be encouraged, I meant they are already loaded with other issues to deal with, and they will either ignore my request to check the cause of this issue or take forever to work on it.

    So, based on your suggestions, I have come up with the above, which works perfectly.

    jrv, let me know if there are any modifications you would suggest to the above.

    • Marked as answer by Moonshekar Thursday, November 19, 2015 2:47 PM
    Thursday, November 19, 2015 11:48 AM
  • This would be cleaner:

    while ($true) {
        try { $content = $webobj.DownloadString($url2); break }
        catch { Start-Sleep -Seconds 10 }
    }
    $content


    \_(ツ)_/


    • Edited by jrv Thursday, November 19, 2015 2:35 PM
    • Marked as answer by Moonshekar Thursday, November 19, 2015 2:45 PM
    Thursday, November 19, 2015 2:34 PM

All replies

  • Most likely you have typed the password wrong.


    \_(ツ)_/

    Wednesday, November 18, 2015 2:02 PM
  • Hi Jrv, 

    You might have missed reading the below. There was no change in the username or password.

    "

    For example, if I run this, say, six times, it may work the seventh time and fail again the eighth, and so on. Hope you are getting my point.

    I am just confused as to why it sometimes fails and sometimes works, without any change in the script or the credentials.

    "


    Wednesday, November 18, 2015 4:06 PM
  • Have you looked at the web server for clues? Since this is intermittent, this very likely isn't a scripting issue.

    Wednesday, November 18, 2015 4:10 PM
  • I can access the web page from a browser without any issues. I don't manage the web server. The requirement is to extract some required information from the page for further use.
    Wednesday, November 18, 2015 4:17 PM
  • jrv may have a point. Unauthorized is a permissions issue.

    As you are using Get-Credential, there is a chance you are entering the password incorrectly. I would store your password in the script using a SecureString (see http://www.adminarsenal.com/admin-arsenal-blog/secure-password-with-powershell-encrypting-credentials-part-1).

    This will ensure you are using the same credentials each time and eliminate that from your list of possibilities.
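
    A minimal sketch of that approach (the file path is a placeholder; Export-Clixml protects the password with DPAPI, so only the same user on the same machine can read it back):

    # One-time setup: capture the credential and save it encrypted to disk.
    $cred = Get-Credential
    $cred | Export-Clixml -Path 'C:\scripts\webcred.xml'

    # In the download script: reload the same credential on every run,
    # so the password can never be mistyped.
    $cred = Import-Clixml -Path 'C:\scripts\webcred.xml'
    $webobj = New-Object System.Net.WebClient
    $webobj.Credentials = $cred.GetNetworkCredential()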

    Wednesday, November 18, 2015 4:20 PM
  • I can access the web page from a browser without any issues. I don't manage the web server. The requirement is to extract some required information from the page for further use.

    I'd suggest talking to the people who manage the server then.


    Wednesday, November 18, 2015 4:20 PM
  • According to your scripts, you have to type in the credentials each time. Perhaps you are not typing the same thing each time.

    Continuous re-authentication on some web sites can cause issues. It is not a scripting issue.
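
    One thing worth ruling out (a sketch, not a confirmed fix): WebClient.Credentials expects an ICredentials object, not a PSCredential, so convert the credential explicitly and reuse a single WebClient instance rather than relying on PowerShell's implicit conversion.

    $credentials = Get-Credential
    $webobj = New-Object System.Net.WebClient
    # Explicit conversion to NetworkCredential avoids depending on
    # PowerShell's implicit PSCredential cast.
    $webobj.Credentials = $credentials.GetNetworkCredential()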


    \_(ツ)_/

    Wednesday, November 18, 2015 4:21 PM
  • Do it this way:

    $url ='webpageurl'
    Invoke-WebRequest -Uri $url -Credential youraccountid


    \_(ツ)_/

    Wednesday, November 18, 2015 4:24 PM
  • I am using, or in fact forced to use, PowerShell 2.0.

    I tried the following for testing:

    $password = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $user = "domain\username"
    $credentials = New-Object System.Management.Automation.PSCredential($user,$password)


    The result was the same, so it is not an issue with the username and password.



    Wednesday, November 18, 2015 4:42 PM
  • I am using, or in fact forced to use, PowerShell 2.0.

    I tried the following for testing:

    $password = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $user = "domain\username"
    $credentials = New-Object System.Management.Automation.PSCredential($user,$password)


    The result was the same, so it is not an issue with the username and password.



    So you have now proven that it is not a scripting or user issue, so contact the webmaster or admin and have them check the logs to see why you are being rejected. The issue cannot be fixed in script.


    \_(ツ)_/

    Wednesday, November 18, 2015 4:48 PM
  • Finding the admin for the web page is not going to work in my environment, as what I am doing would not be encouraged.

    I have come up with the following workaround:

    $url2 = ("weburl")
    $password = ConvertTo-SecureString -String "password" -AsPlainText -force
    $user = "domain\username"
    $credentials = new-object system.management.automation.pscredential($user,$password)
    $webobj = New-Object system.net.webclient
    $webobj.credentials = $credentials
    do{
        $content=$webobj.downloadstring($url2)
        $credentials
      }until($content)

    Now the web page gets downloaded after a few tries; however, while a try fails, PowerShell keeps throwing the following error until the download is successful.

    Exception calling "DownloadString" with "1" argument(s): "The remote server returned an error: (401) Unauthorized."
    At C:\Users\schandras_adm\Documents\windowspowershell\get-ndswebpage2.ps1:9 char:36
    +     $content=$webobj.downloadstring <<<< ($url2)
        + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
        + FullyQualifiedErrorId : DotNetMethodException

    How do I make this error be ignored, or how do I make it not appear in the PowerShell window while a try fails?

    Wednesday, November 18, 2015 5:49 PM
  • You are doing a few bad things:

    Do it this way.

    $url2 = 'weburl'
    $password = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $user = "domain\username"
    
    $credentials = New-Object System.Management.Automation.PSCredential($user,$password)
    $webobj = New-Object System.Net.WebClient
    $webobj.Credentials = $credentials
    
    do {
        $content = $webobj.DownloadString($url2)
        Start-Sleep -Seconds 10
    } until ($content)

    Don't add more than single quotes around the URL. Don't loop fast, and don't constantly output $credentials.


    \_(ツ)_/


    • Edited by jrv Wednesday, November 18, 2015 6:07 PM
    Wednesday, November 18, 2015 6:06 PM
  • How do I make this error be ignored, or how do I make it not appear in the PowerShell window while a try fails?
    EDIT: On second thought, I shouldn't post this information, due to your other comment.
    Finding the admin for the web page is not going to work in my environment, as what I am doing would not be encouraged.

    You shouldn't be doing it then. Are you really willing to risk your job over this?


    Wednesday, November 18, 2015 6:07 PM
  • You may find, if what you are doing is discouraged, that the site is rejecting you because of repeated sign-ons.

    Mike's right on two scores... the web admin is the person who can help you, and...

    If you know you shouldn't do it, don't do it.

    Thursday, November 19, 2015 8:47 AM
  • $url2 = 'weburl'
    $password = ConvertTo-SecureString -String "password" -AsPlainText -Force
    $user = "domain\username"
    $credentials = New-Object System.Management.Automation.PSCredential($user, $password)
    $webobj = New-Object System.Net.WebClient
    $webobj.Credentials = $credentials
    do {
        $erroroccurred = $false
        try {
            $content = $webobj.DownloadString($url2)
        }
        catch {
            $erroroccurred = $true
        }
    } while ($erroroccurred)
    $content

    Thanks, Mike, for your concern and suggestion. Thanks, jrv. When I said it would not be encouraged, I meant they are already loaded with other issues to deal with, and they will either ignore my request to check the cause of this issue or take forever to work on it.

    So, based on your suggestions, I have come up with the above, which works perfectly.

    jrv, let me know if there are any modifications you would suggest to the above.

    • Marked as answer by Moonshekar Thursday, November 19, 2015 2:47 PM
    Thursday, November 19, 2015 11:48 AM
  • Thanks, Mike, for your concern and suggestion. Thanks, jrv. When I said it would not be encouraged, I meant they are already loaded with other issues to deal with, and they will either ignore my request to check the cause of this issue or take forever to work on it.

    Okay, in that case you've already hit on exactly what I originally posted. Try/catch is what you're looking for.

    Thursday, November 19, 2015 12:59 PM
  • This would be cleaner:

    while ($true) {
        try { $content = $webobj.DownloadString($url2); break }
        catch { Start-Sleep -Seconds 10 }
    }
    $content
    $content
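
    One caveat with that loop (a sketch only; the cap of 5 retries is an arbitrary choice): if the server keeps rejecting the request, it will never exit, so a bounded variant may be safer.

    $maxRetries = 5
    for ($i = 0; $i -lt $maxRetries; $i++) {
        try { $content = $webobj.DownloadString($url2); break }
        catch { Start-Sleep -Seconds 10 }
    }
    $content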


    \_(ツ)_/


    • Edited by jrv Thursday, November 19, 2015 2:35 PM
    • Marked as answer by Moonshekar Thursday, November 19, 2015 2:45 PM
    Thursday, November 19, 2015 2:34 PM