vinnyMS1 0 Posted September 28, 2022

I have "https://wannabuy-4088.myshopify.com/products/" and a text file ("url.txt", attached) listing more URLs to add after "/products/", for example "products/christmas-tree-sweater-99929". For each line of the url text file, the wanted result is: /folder/christmas-tree-sweater-99929.html

url.txt
argumentum 905 Posted September 28, 2022

So, are you the owner of the site, or do you just want to scrape it? Do you know HTML? What is this for, what do you want to do with it, and what's the end goal?
Subz 823 Posted September 29, 2022

@argumentum Shopify is an online shopping cart host that anyone can join. It appears Vinny wants to prepend and append text to each line of the file.

@vinnyMS1 Couple of quick examples:

#include <Array.au3>
#include <File.au3>

Global $g_sPrefix = "/products/"
Global $g_sSuffix = ".html"

;~ Example 1 - read url.txt into an array and rebuild each line
Global $g_aURLs
_FileReadToArray(@ScriptDir & "\url.txt", $g_aURLs)
If @error Then Exit MsgBox(4096, "Error", "Unable to read url.txt to Array")
_ArrayDisplay($g_aURLs, "Before Changes")
For $i = 1 To $g_aURLs[0]
    $g_aURLs[$i] = $g_sPrefix & $g_aURLs[$i] & $g_sSuffix
Next
_ArrayDisplay($g_aURLs, "After Changes")

;~ Example 2 - same result in one pass, replacing every line break with suffix + new line + prefix
Global $g_sURLs = $g_sPrefix & StringRegExpReplace(FileRead(@ScriptDir & "\url.txt"), "(?:\r\n|\n)", $g_sSuffix & @LF & $g_sPrefix) & $g_sSuffix
;~ A trailing newline in the file leaves a stray "/products/.html" entry, so strip it before output
ConsoleWrite(StringReplace($g_sURLs, "/products/.html", "") & @CRLF)
vinnyMS1 0 Posted September 29, 2022 (Author)

I want to download each page and save it as an HTML file, for example from https://wannabuy-4088.myshopify.com/products/christmas-tree-sweater-99929 to christmas-tree-sweater-99929.html in a local folder. I'm going to create a new website from the HTML files, but with no buy option.
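For that download step, a minimal AutoIt sketch could loop over the same url.txt list and fetch each page with InetGet. The "pages" output folder and the one-slug-per-line layout of url.txt are assumptions for illustration, not details confirmed in the thread:

#include <File.au3>
#include <InetConstants.au3>

Global Const $g_sBaseURL = "https://wannabuy-4088.myshopify.com/products/" ; base URL from the first post
Global Const $g_sOutDir = @ScriptDir & "\pages" ; hypothetical local output folder
DirCreate($g_sOutDir)

Global $g_aSlugs
_FileReadToArray(@ScriptDir & "\url.txt", $g_aSlugs) ; assumes one product slug per line
If @error Then Exit MsgBox(4096, "Error", "Unable to read url.txt")

For $i = 1 To $g_aSlugs[0]
    Local $sSlug = StringReplace($g_aSlugs[$i], "products/", "") ; drop a leading "products/" if present
    If $sSlug = "" Then ContinueLoop ; skip blank lines
    ;~ download the product page and save it as <slug>.html
    Local $iBytes = InetGet($g_sBaseURL & $sSlug, $g_sOutDir & "\" & $sSlug & ".html", $INET_FORCERELOAD)
    If $iBytes = 0 Then ConsoleWrite("Download failed: " & $sSlug & @CRLF)
Next

InetGet is called in blocking mode here, so the script waits for each page; for a long list the $INET_DOWNLOADBACKGROUND option together with InetGetInfo would let the downloads run without freezing the script.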
Werty 574 Posted September 29, 2022

Maybe use a tool that is already tailored for such tasks, like HTTrack; it copies a website and saves it locally. https://www.httrack.com/ You can use filters, here's a short guide: https://www.httrack.com/page/20/en/index.html
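As a rough sketch (check the linked filter guide for the exact syntax; the "mirror" folder name is just an example), a command-line call using HTTrack's -O output option and a "+" URL filter to keep the copy limited to the product pages might look like:

httrack "https://wannabuy-4088.myshopify.com/" -O "./mirror" "+*wannabuy-4088.myshopify.com/products/*"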