  • Similar Content

    • By walec
      Hello
      The function below worked for a long time. Now it doesn't download anything and I don't know where the error is.
      #include <InetConstants.au3>
      #include <StringConstants.au3> ; needed for $SB_UTF8

      Local $Link = 'https://allegro.pl/kategoria/nasiona-warzywa-99776?bmatch=e2101-d3681-c3682-hou-1-4-0319'
      Local $dStrona = InetRead($Link)
      Local $sStrona = BinaryToString($dStrona, $SB_UTF8)
      ;GUICtrlSetData($myedit, $sStrona)
      ConsoleWrite($sStrona)
      The function was good because it reads the data in the background.
      Thank you for any hints, or for other solutions/functions that read data in the background.
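      Not part of the original post, but a minimal sketch for narrowing this down: check @error and @extended after InetRead(), and retry once with a browser-like User-Agent via HttpSetUserAgent(). The idea that the site now rejects the default WinINet user agent is only an assumption, not a confirmed cause.
      #include <InetConstants.au3>
      #include <StringConstants.au3>

      ; Sketch only: report why InetRead() returned nothing, then retry once
      ; with a desktop-browser User-Agent (assumed cause, not confirmed).
      Local $sLink = 'https://allegro.pl/kategoria/nasiona-warzywa-99776?bmatch=e2101-d3681-c3682-hou-1-4-0319'

      Local $dStrona = InetRead($sLink, $INET_FORCERELOAD)
      If @error Or @extended = 0 Then
          ConsoleWrite("First try failed: @error=" & @error & ", bytes=" & @extended & @CRLF)
          HttpSetUserAgent("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
          $dStrona = InetRead($sLink, $INET_FORCERELOAD)
      EndIf

      ConsoleWrite(BinaryToString($dStrona, $SB_UTF8) & @CRLF)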
       
    • By nacerbaaziz
      Good morning, AutoIt team.
      Please, is there any way of searching YouTube that still works?
      I was fetching the page source and splitting it to get the results, but now that does not work at all.
      Has YouTube disabled that? And is there any other simple way to do it?
      I tested all the examples found in this post, but they don't work either:
      https://www.autoitscript.com/forum/topic/123945-youtube-search/
      Here is the example that I used; it doesn't work any more either:
      local $hSearchOpenHNDL, $hSearchConnect, $sSearchGet
      local $a_UrlsArray[1][5]
      local $b_ButtonsDisabled = false, $b_SearchBTNFocus = false, $b_SearchListFocus, $h_SearchFocusHND
      local $Return = "0"
      local $s_OpenStringY = "/feed/trending"
      local $ChannelUrl = "", $channelName = ""
      if Not ($a_YoutubeSearchArray[0][0] = 0) then
          GUICtrlSetData($searchInp, $s_youtubeSearchLastSearch)
          $s_OpenStringY = "/results?search_query=" & StringReplace(GUICtrlRead($searchInp), " ", "+")
          $a_UrlsArray = $a_YoutubeSearchArray
          for $i = 1 to $a_UrlsArray[0][0]
              _GUICtrlListBox_AddString($SearchList, $a_UrlsArray[$i][0] & $a_UrlsArray[$i][2] & $a_UrlsArray[$i][3])
          next
          _GUICtrlListBox_SetCurSel($SearchList, $I_youtubeSearchLastIndex-1)
          GUICtrlSetState($SearchList, $GUI_focus)
      else
          if Ping("youtube.com", 1000) > 1 then
              $hSearchOpenHNDL = _WinHttpOpen('')
              if not (@Error) then
                  $hSearchConnect = _WinHttpConnect($hSearchOpenHNDL, "youtube.com")
                  if Not (@Error) then
                      $sSearchGet = _WinHttpSimpleRequest($hSearchConnect, "get", $s_OpenStringY)
                      if not (@Error) then
                          local $a_strings = _StringBetween($sSearchGet, '<a href="/watch', "<ul")
                          local $title = ""
                          local $url = ""
                          local $length = ""
                          local $result = ""
                          GUICtrlSetData($SearchList, "")
                          ReDim $a_UrlsArray[1][5]
                          for $i = 0 to UBound($a_strings)-1
                              $url = _StringBetween($a_strings[$i], "?", '"')
                              if @error then ContinueLoop
                              $url = "https://www.youtube.com/watch?" & $url[0]
                              $title = _StringBetween($a_strings[$i], 'dir="', '</a>')
                              if @error then ContinueLoop
                              $title = $title[0]
                              $title = StringRegExpReplace($title, '(.*\"\>)', "")
                              if StringRegExp($a_strings[$i], '[0-9]{1,2}\:[0-9]{1,2}\:[0-9]{1,2}', 0) = 1 then
                                  $length = StringRegExp($a_strings[$i], '[0-9]{1,2}\:[0-9]{1,2}\:[0-9]{1,2}', 2)
                              elseIf StringRegExp($a_strings[$i], '[0-9]{1,2}\:[0-9]{1,2}', 0) = 1 then
                                  $length = StringRegExp($a_strings[$i], '[0-9]{1,2}\:[0-9]{1,2}', 2)
                              else
                                  $length = ""
                              endIf
                              if IsArray($length) then
                                  $length = ": (" & $length[0] & ")"
                              else
                                  $length = ""
                              endIf
                              $ChannelUrl = stringRegexpReplace($a_strings[$i], '(^.*?<a.*?\"(\/user|\/channel))+', "$2")
                              $channelName = stringRegexpReplace($ChannelUrl, '(.*?\".*?>)(.*</a>)+', "$2")
                              $ChannelUrl = stringRegexpReplace($ChannelUrl, '(\".*)+', "")
                              $channelName = stringRegexpReplace($channelName, '(</a>.*)+', "")
                              $result &= $title & @crlf & $url & @crlf
                              ReDim $a_UrlsArray[UBound($a_UrlsArray)+1][5]
                              $a_UrlsArray[UBound($a_UrlsArray)-1][0] = $title
                              $a_UrlsArray[UBound($a_UrlsArray)-1][1] = $url
                              $a_UrlsArray[UBound($a_UrlsArray)-1][2] = $length
                              if not ($channelName = "") then $a_UrlsArray[UBound($a_UrlsArray)-1][3] = ", (" & $channelName & ")"
                              if not ($channelUrl = "") then $a_UrlsArray[UBound($a_UrlsArray)-1][4] = "https://www.youtube.com" & $channelUrl
                              $a_UrlsArray[0][0] = UBound($a_UrlsArray)-1
                              _GUICtrlListBox_AddString($SearchList, $a_UrlsArray[UBound($a_UrlsArray)-1][0] & $length & $a_UrlsArray[UBound($a_UrlsArray)-1][3])
                          next
                      endIf
                  endIf
              endIf
          endIf
      endIf
      I hope someone can help me.
      Thanks in advance.
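      Not from the original post: scraping the '<a href="/watch' anchors stopped working because YouTube now builds the results page from embedded JSON rather than plain HTML links. One alternative, sketched below, is to query the official YouTube Data API v3 search endpoint; this assumes you register your own API key, and both the key and the query below are placeholders.
      ; Sketch: search YouTube through the Data API v3 instead of scraping HTML.
      ; $sApiKey is a hypothetical placeholder - get a real key from the Google Cloud console.
      Local $sApiKey = "YOUR_API_KEY"
      Local $sQuery  = "autoit tutorial"

      Local $sUrl = "https://www.googleapis.com/youtube/v3/search?part=snippet&type=video&maxResults=10" & _
              "&q=" & StringReplace($sQuery, " ", "+") & "&key=" & $sApiKey
      Local $sJson = BinaryToString(InetRead($sUrl, 1), 4) ; 4 = UTF-8

      ; Crude extraction of video IDs and titles from the JSON text
      Local $aIds    = StringRegExp($sJson, '"videoId":\s*"([^"]+)"', 3)
      Local $aTitles = StringRegExp($sJson, '"title":\s*"([^"]+)"', 3)
      If Not IsArray($aIds) Or Not IsArray($aTitles) Then
          ConsoleWrite("No results or the request failed." & @CRLF)
      Else
          For $i = 0 To UBound($aIds) - 1
              If $i >= UBound($aTitles) Then ExitLoop
              ConsoleWrite($aTitles[$i] & " -> https://www.youtube.com/watch?v=" & $aIds[$i] & @CRLF)
          Next
      EndIf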
    • By Blueman
      Hi all,
      I was wondering if you can help me with the InetRead() function.
      My scripts use this function a lot under several conditions, and everything works fine.
      But sometimes, when the server is a bit buggy or simply not available, my script hangs.
      It takes about 90 seconds before this function returns with a timeout, and when I adjust the parameter it still hangs for about 90 seconds.
      The following script is an example where the script hangs for approximately 90 seconds:
      ; Set Timeout to 2sec
      AutoItSetOption("TCPTimeout", 2000)
      ; Read Website
      InetRead("http://www.geenverbinding.nl/", 1)
      ; Show Msgbox before Ending Script.
      MsgBox(64, "", "Finished")
      The following script is an example where the script shows the MsgBox pretty quickly:
      ; Set Timeout to 2sec
      AutoItSetOption("TCPTimeout", 2000)
      ; Read Website
      InetRead("http://www.google.nl/", 1)
      ; Show Msgbox before Ending Script.
      MsgBox(64, "", "Finished")
      My question now is: what am I doing wrong, and/or is there another way to prevent the script from hanging?
      Thanks all
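      A note that is not from the original post: as far as I know, the "TCPTimeout" option only affects the TCP*() functions, not InetRead(), which uses WinINet and has no timeout parameter of its own. One common workaround (a sketch, not the only way) is to start the download in the background with InetGet() and enforce your own deadline:
      #include <InetConstants.au3>

      ; Sketch: enforce our own 2-second deadline by polling a background InetGet()
      ; instead of calling the blocking InetRead().
      Local $sUrl     = "http://www.geenverbinding.nl/"
      Local $sTmpFile = @TempDir & "\inet_test.htm"

      Local $hDownload = InetGet($sUrl, $sTmpFile, $INET_FORCERELOAD, $INET_DOWNLOADBACKGROUND)
      Local $hTimer = TimerInit()

      While Not InetGetInfo($hDownload, $INET_DOWNLOADCOMPLETE)
          If TimerDiff($hTimer) > 2000 Then ExitLoop ; our own timeout
          Sleep(50)
      WEnd

      Local $bOk = InetGetInfo($hDownload, $INET_DOWNLOADSUCCESS)
      InetClose($hDownload) ; aborts the request if it is still running

      If $bOk Then
          MsgBox(64, "", "Finished - downloaded " & FileGetSize($sTmpFile) & " bytes")
      Else
          MsgBox(48, "", "Timed out or failed after " & Round(TimerDiff($hTimer)) & " ms")
      EndIf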
       
    • By 31290
      Hi everyone, hope you are doing fine
      Well, I'm currently writing a small script that goes to a certain web page, finds the first link of a specified section and downloads the file associated with that link.
      Depending on the computer the tool is launched on, the script gets the computer model and looks up in the (provided here) ini file which link to follow.
      At first, Dell was kind enough to provide only one link, but now they provide two of them. The first one is now a .txt file, whereas my script has been designed to download only the first and latest link released for the BIOS update.

      Here's the current code, which works with only the first and latest link of the BIOS category:
      So the question is: 
      In the case of double links, as shown in the picture above, how is it possible to tell the script to download only the link pointing to the .exe file?
      Of course, I could have changed the array index to [1] instead of [0] (which works), but it seems that Dell does this randomly, and I deal with a lot of computer models.
      Thanks for the help you can provide, 
      -31290-
       
      SEE_BIOS_LINKS.ini
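      A sketch that is not part of the original script: once the links of the BIOS section are in an array, loop over it and keep the first entry that ends in ".exe" instead of blindly taking index [0]. $aLinks and the two URLs below are placeholders for whatever the existing code extracts.
      ; Sketch: pick the first BIOS link that actually points to an .exe file.
      ; $aLinks stands in for the array of URLs the real script already builds.
      Local $aLinks[2] = ["https://downloads.dell.com/SOME_FOLDER/BIOS_release_notes.txt", _
                          "https://downloads.dell.com/SOME_FOLDER/Latitude_BIOS_Update.exe"]

      Local $sBiosUrl = ""
      For $i = 0 To UBound($aLinks) - 1
          If StringRight($aLinks[$i], 4) = ".exe" Then ; "=" compares strings case-insensitively
              $sBiosUrl = $aLinks[$i]
              ExitLoop
          EndIf
      Next

      If $sBiosUrl <> "" Then
          ConsoleWrite("Downloading: " & $sBiosUrl & @CRLF)
          ; InetGet($sBiosUrl, @TempDir & "\" & StringTrimLeft($sBiosUrl, StringInStr($sBiosUrl, "/", 0, -1)))
      Else
          ConsoleWrite("No .exe link found in the list." & @CRLF)
      EndIf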
    • By Mark917
      I am trying to extract a date and time from a website using InetRead.  The source for the page I am reading shows the date and time as shown below.  I need to somehow extract the correct date and time from this information.  Is there a function or calculation that will do this?  The source shown here correlates to 15:20 (time) 29.03.17 (date).
      <div class="cell game-date" data-time="1490818800000">
          <div class="site__time" data-time="1490818800000">
              <div class="time"></div>
              <div class="date"></div>
          </div>
      </div>
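      Not part of the original post: the data-time attribute is a Unix timestamp in milliseconds. A sketch of the conversion with the stock Date.au3 UDF is below; $sHtml stands in for the page source returned by InetRead()/BinaryToString(). The 1490818800000 value works out to 2017/03/29 20:20:00 UTC, which matches the displayed 15:20 if the site's local zone is UTC-5.
      #include <Date.au3>

      ; Sketch: pull the data-time value out of the fetched HTML and convert it to a date.
      Local $sHtml = '<div class="cell game-date" data-time="1490818800000">'

      Local $aMatch = StringRegExp($sHtml, 'data-time="(\d+)"', 1)
      If Not @error Then
          Local $iSeconds = Int($aMatch[0] / 1000) ; milliseconds -> seconds
          Local $sUtc = _DateAdd("s", $iSeconds, "1970/01/01 00:00:00")
          ConsoleWrite($sUtc & " UTC" & @CRLF) ; 2017/03/29 20:20:00
      EndIf
      Converting that UTC value to local time then only needs your own UTC offset applied on top.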