Hello,
I am trying to set up a site where users can design calendars from their own pictures. To do that I have built a form that users can attach files to and submit to my server (using the form2mail CGI script attached below). It works well, but only for small files. For large files (I need to upload up to 13 files, each up to 5 MB) it crashes after an hour and no files are saved. My host replied that there is a timeout on the server that they do not wish to increase.
Since the script works well for a single large file, my idea is to call it several times and each time ask it to handle and upload only a single file. However, I want my users to fill in the form and select the files they want to upload only once. So after the form calls the script the first time, I need a way to call a second script automatically, without the user's intervention, and to pass it the file names collected by the first script. Since the script ends by calling an HTML thank-you page, I thought of using this page to call the second script with
<meta HTTP-EQUIV="REFRESH" content="0; url=http://gilep.netfirms.com/cgi-bin/gilepf2mcalendar.pl">
This command does the job, but again I have no idea how to pass the information from one script to the other.
The file names are stored in my @file_upload_fields.
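For example, something like the following is what I imagine for the thank-you page, cobbled together from examples I found (the files parameter name is just my guess, and I realise this would pass only the names, not the pictures themselves):

#!/usr/bin/perl
# first script: after handling one file, print a thank-you page whose
# refresh URL carries the remaining file names in its query string
use strict;
use warnings;
use URI::Escape;

my @file_upload_fields = ('pic02.jpg', 'pic03.jpg');    # names still to do
my $names = uri_escape(join ',', @file_upload_fields);

print "Content-type: text/html\n\n";
print qq{<html><head>\n};
print qq{<meta HTTP-EQUIV="REFRESH" content="0; url=http://gilep.netfirms.com/cgi-bin/gilepf2mcalendar.pl?files=$names">\n};
print qq{</head><body>Thank you - uploading the next file...</body></html>\n};

The second script would then read the files parameter back (with CGI.pm, say) and split it on the commas.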
BTW, I have no programming expertise whatsoever.
Thanks for any help related to these issues.
Thanks,
Gil

Within a Perl program you can run command-prompt commands by typing
system "some-command";
so maybe within your script you could run another script by typing:
system "somescript.pl nextpicture";
The nextpicture would then become an argument to the next script, where it ends up in @ARGV, I think.
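Roughly like this, untested, with somescript.pl and the picture name as placeholders:

#!/usr/bin/perl
# caller side: hand one picture name to the second script
use strict;
use warnings;

my $nextpicture = 'pic01.jpg';                # placeholder name
system("perl somescript.pl $nextpicture");    # runs it and waits

# inside somescript.pl the argument shows up in @ARGV:
# my ($picture) = @ARGV;
# print "Handling picture: $picture\n";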

Hi,
Thanks for trying to help me out.
No, your suggestion didn't help, or I didn't implement it correctly.
Let me try to clarify the problem. It starts when the user fills in the form and selects 13 pictures on his PC to be uploaded to my web site. As long as each file is no more than a few hundred KB, everything works fine. When each file is 3 MB, for example, it crashes and no files are saved at all (which is also quite strange). When a smaller number of large files is uploaded it works well. The script's size limit for each file is set to 6 MB, so that is not the issue. The reply I got from my host was that there is a timeout that I am probably exceeding and that this is why it crashes. They told me the timeout is 5 minutes, which is also quite strange, since the script runs and uploads fine for 20 or 30 minutes, and when it crashes it is after an hour or more.
Since the script handles a single large file well, I thought of calling it several times, but without bothering the user to enter each file name separately: the extra calls should happen "behind the scenes".
People suggested I look for a PHP-based solution, but I found out that it needs some configuration parameters that my host does not allow me to change. Others suggested a web-based FTP client, but I couldn't find one simple enough that the user only has to select the files and push an upload button, and it also requires a certain version of Java to be installed.
This is why I came up with this question.
Any other ideas on how to solve such an issue would be most welcome.
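To make the "behind the scenes" idea concrete, here is roughly what I picture the second script doing, pieced together from examples I found, so it may well be wrong. The files parameter and the script refreshing to itself are my own guesses, and I realise this only passes the file names around, not the picture data itself:

#!/usr/bin/perl
# gilepf2mcalendar.pl - rough sketch: handle one file name per call,
# then refresh to this same script with whatever names remain
use strict;
use warnings;
use CGI;
use URI::Escape;

my $q = CGI->new;
my $list = $q->param('files') || '';
my @names = split /,/, $list;
my $current = shift @names;     # the one name handled on this pass

# ... the form2mail work for $current would go here ...

print $q->header('text/html');
if (@names) {
    my $rest = uri_escape(join ',', @names);
    print qq{<meta HTTP-EQUIV="REFRESH" content="0; url=gilepf2mcalendar.pl?files=$rest">};
}
else {
    print 'All done - thank you!';
}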