
InetGet() problem


lemony

Hi guys!

When you download a file with InetGet(), the file is stored on the hard drive twice. The first copy goes to the download location that you set. The second is a "real-time" copy written to your Temporary Internet Files folder. This doubles the disk writes and uses twice the hard drive space.

I found out about this when I wrote a script last night that was supposed to download a couple of gigs' worth of files, and I only had a few gigs of free space (enough to store those files). I let it download all night and woke up to a bunch of write errors; the machine had frozen, so I had to do a hard reboot. When I got back in, I found that my hard drive was 100% full.

So my question is: is there any way to get InetGet() to download the file without also storing a real-time copy in the temp folder?

Or is there an alternative to InetGet() or some workaround that I could do?

I want the file to download to the location I choose, without other copies scattered around.

Thanks if you can help.

pugboy

Well, my first suggestion: clear up some space on your HDD.

I do not think that there is an alternative way, unless you use the FTP UDFs (but it requires the server to allow FTP)

lemony

Hi pugboy, I definitely won't be able to use FTP.

I suppose I could free up some hard drive space, but that's not going to solve this problem. I only mentioned hard drive space in my first post because it's what led me to discover this behavior.

Let's forget about hard drive space. The fact is, files downloaded with InetGet() are being copied in real time to two locations on the hard drive. What's the purpose of this behavior? There is nothing good about it. It just puts unnecessary strain on the CPU and hard drive, and it really takes its toll when downloading large files.

Is this something that the developers are going to address?


pugboy

There is a function (_INetGetSource()) that does not write a temp file, but it is for viewing HTML source files.
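For small text resources, a rough sketch of that approach (untested; _INetGetSource() comes from the standard Inet.au3 UDF and returns the body as a string, so it is not suitable for binary downloads):

```autoit
#include <Inet.au3>

; Fetch the page body straight into memory - no Temporary Internet
; Files copy is left behind - then save it to disk ourselves.
Local $sBody = _INetGetSource("http://www.example.com/page.html")
If @error Then Exit MsgBox(16, "Error", "Download failed")

Local $hFile = FileOpen(@ScriptDir & "\page.html", 2) ; 2 = open for write, erase existing
FileWrite($hFile, $sBody)
FileClose($hFile)
```

The URL and file name here are just placeholders. For multi-gigabyte files this is no help at all, since the whole body would have to fit in a string.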

Confuzzled

I see on a Vista machine that a large download completes to 100% and then displays a message that it is copying the temporary file before allowing you to access the downloaded file and folder. Somehow I suspect this is a limitation of Windows, rather than the InetGet subroutine.

I wonder about your system stability if you need to worry about enough disk space for transient files.

lemony

Quoting Confuzzled: "I see on a Vista machine that a large download completes to 100% and then displays a message that it is copying the temporary file before allowing you to access the downloaded file and folder. Somehow I suspect this is a limitation of Windows, rather than the InetGet subroutine."

I don't believe it's a Windows limitation. That's just how IE downloads files. Use any other software to download a file and it will be saved directly to the location you choose.

IE downloads the file to the temp folder, then copies it over to the location where you wanted to save it. It's ridiculous, but that's IE for you.

Now InetGet() takes it to an even more ridiculous level. It simultaneously downloads the file to your temp folder AND to the download location you want. It makes absolutely no sense. It's saving the file to the location you want in real time (that's good), yet it makes another real-time copy in the temp folder (that's bad).

All it should do is take the URL you give it and save the file to the location you told it to. I don't understand the point of this simultaneous real-time copy in the temp folder. I've never seen any other downloader do this.

Quoting Confuzzled: "I wonder about your system stability if you need to worry about enough disk space for transient files."

Thanks for your reply and your concern, but please let's forget about it. I'm sorry I mentioned it. That PC is just a crappy old machine with a 20 GB hard drive that I don't really care about; it's not my main system. Let's just discuss InetGet() in this thread, because I really want to either get this fixed or at least find out why it behaves in this manner.

Dattebayo

Sorry I have to revive a year-old topic, but this is still very much an issue with InetGet in 3.3.0.0. As lemony says, InetGet writes to the specified location as well as to Temporary Internet Files, both in real time, meaning it requires twice the resources. What's even more disturbing is that it doesn't properly delete the temp file upon completion. It seems to be missing an instruction not to write to the default path when a file path is specified.
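For anyone still bitten by this, one possible workaround is to skip InetGet() entirely and drive WinInet yourself via DllCall(), passing INTERNET_FLAG_NO_CACHE_WRITE (0x04000000) so nothing is written to Temporary Internet Files. This is an untested sketch: the function names and the flag come from the documented WinInet API, but the error handling is minimal and you should verify it against your AutoIt version:

```autoit
; Download $sUrl to $sDest without a Temporary Internet Files copy.
; Sketch only: relies on WinInet's INTERNET_FLAG_NO_CACHE_WRITE flag.
Func _DownloadNoCache($sUrl, $sDest)
    Local Const $INTERNET_FLAG_NO_CACHE_WRITE = 0x04000000

    Local $aOpen = DllCall("wininet.dll", "handle", "InternetOpenW", _
            "wstr", "AutoIt", "dword", 0, "ptr", 0, "ptr", 0, "dword", 0)
    If @error Or Not $aOpen[0] Then Return SetError(1, 0, False)

    Local $aUrl = DllCall("wininet.dll", "handle", "InternetOpenUrlW", _
            "handle", $aOpen[0], "wstr", $sUrl, "ptr", 0, "dword", 0, _
            "dword", $INTERNET_FLAG_NO_CACHE_WRITE, "dword_ptr", 0)
    If @error Or Not $aUrl[0] Then Return SetError(2, 0, False)

    Local $hFile = FileOpen($sDest, 2 + 16) ; overwrite + binary mode
    Local $tBuf = DllStructCreate("byte[8192]")
    While 1
        ; Read the next chunk; $aRead[4] receives the bytes actually read.
        Local $aRead = DllCall("wininet.dll", "bool", "InternetReadFile", _
                "handle", $aUrl[0], "struct*", $tBuf, "dword", 8192, "dword*", 0)
        If @error Or Not $aRead[0] Or $aRead[4] = 0 Then ExitLoop
        FileWrite($hFile, BinaryMid(DllStructGetData($tBuf, 1), 1, $aRead[4]))
    WEnd
    FileClose($hFile)

    DllCall("wininet.dll", "bool", "InternetCloseHandle", "handle", $aUrl[0])
    DllCall("wininet.dll", "bool", "InternetCloseHandle", "handle", $aOpen[0])
    Return True
EndFunc

_DownloadNoCache("http://www.example.com/big.zip", @ScriptDir & "\big.zip")
```

The URL is a placeholder. Later AutoIt releases reworked InetGet(), so it's worth re-testing whether the cache copy is still made before resorting to this.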


