Re: Asynchronous FTP Upload [message #172113 is a reply to message #172091]
Mon, 31 January 2011 01:07
From: Peter H. Coffin

On Sun, 30 Jan 2011 17:06:40 +0000, The Natural Philosopher wrote:
> Peter H. Coffin wrote:
>> On Sun, 30 Jan 2011 15:48:29 +0100, Luuk wrote:
>>> On 30-01-11 15:20, Jerry Stuckle wrote:
>>>> Not really. You're not going to get 1gb/sec. or even 200mb/sec. from
>>>> the disk drive, especially not continuously. So even if the download
>>>> speed on the other end is 200mb/sec, that's still not going to be a
>>>> limiting factor.
>>>>
>>>> And forcing the disk to pull data from several different files on the
>>>> disk will slow overall disk access even more, especially if the
>>>> files are contiguous.
>>> But if the files are sent to 500 hosts, the file might be in cache, if
>>> enough memory is available, which should speed up disk access again... ;)
>>
>> That'd likely be true for a Very Large system, but I'd not bet on it
>> for the "hundreds of megabytes" original situation unless it's a
>> completely dedicated system/cache. There are still other processes going
>> on that are going to end up with their own stuff.
>>
> Of course it will be cached if there is adequate memory to do it. If
> there isn't, add more.

Sometimes that's a solution.

> At least on Linux EVERYTHING is cached up to the memory limit: only if a
> process needs more real physical RAM than is available will the
> existing disk cache buffers be flushed.

That's kinda the point. EVERYTHING gets cached, without much attention
being paid to what's going to be needed again and when. (At least, that
was the situation when I last looked at it; there wasn't any way to tell
the OS "This file is more important to keep in cache than that one.")
Which on a busy system means you have a very FULL cache, but not
necessarily that the right parts of a Large File are going to be there.
The next bit needed for a download may well have been purged already
because some log file wanted its space in the cache.

Writing a dedicated downloader app that can allocate that much memory
for the whole file, then share that memory explicitly with many little
clones of itself, will ensure that purging of the needed bits doesn't
happen.
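
Just to illustrate the idea, here's a minimal sketch in PHP (assuming
the pcntl extension and the ftp:// stream wrapper with allow_url_fopen
enabled; the hosts, path, and credentials are all made up). Read the
file once in the parent, fork a child per host, and let fork()'s
copy-on-write pages do the sharing:

<?php
// Sketch only: $hosts, $localFile, and the login are placeholders.
$localFile = '/data/big-release.tar.gz';
$hosts     = array('ftp1.example.com', 'ftp2.example.com');

// One disk read in the parent; after pcntl_fork() the children
// inherit these pages copy-on-write, so every upload is served
// from the same physical RAM instead of hitting the disk again.
$buf = file_get_contents($localFile);
if ($buf === false) {
    fwrite(STDERR, "cannot read $localFile\n");
    exit(1);
}

$pids = array();
foreach ($hosts as $host) {
    $pid = pcntl_fork();
    if ($pid === 0) {
        // Child: stream the in-memory copy straight to this host.
        $url = "ftp://user:secret@$host/" . basename($localFile);
        $ok  = file_put_contents($url, $buf);
        exit($ok === false ? 1 : 0);
    } elseif ($pid > 0) {
        $pids[] = $pid;
    }
}

foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);   // parent reaps the workers
}
?>

file_put_contents() streams the buffer out without re-reading the disk,
so until something writes to those pages all the children are reading
the same physical memory.
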
And a tool like that can probably come very, very close to saturating
the outbound link, especially if there's something semi-intelligent
managing the spawned processes. For example, it could keep track of the
average speed to each site; when it sees X kbps of headroom between the
expected link speed and what's currently being sent, it can pick the
next site to send to by choosing the one with the highest historical
rate that still fits into the remaining bandwidth window.
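
Again purely as a sketch of that selection rule (the names here are
invented, and the rate table would really be built from measured
transfer speeds):

<?php
// Pick the idle host with the highest historical rate that still
// fits into whatever bandwidth the running transfers leave free.
function pick_next_host(array $rates, array $active, $linkKbps)
{
    // Sum the expected rates of the transfers already running.
    $inUse = 0;
    foreach ($active as $host) {
        if (isset($rates[$host])) {
            $inUse += $rates[$host];
        }
    }
    $spare = $linkKbps - $inUse;   // the "X kbps left" window

    $best = null;
    foreach ($rates as $host => $kbps) {
        if (in_array($host, $active, true)) {
            continue;              // already sending to this one
        }
        if ($kbps <= $spare && ($best === null || $kbps > $rates[$best])) {
            $best = $host;         // fastest idle host that still fits
        }
    }
    return $best;                  // null means nothing fits right now
}

// Example: a 10000 kbps link with one transfer already running.
$rates  = array('a.example' => 3000, 'b.example' => 2500, 'c.example' => 4000);
$active = array('a.example');
echo pick_next_host($rates, $active, 10000), "\n";   // c.example
?>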

But this is, I think, getting a little too far into the design aspect,
though it's probably still within PHP's capacity.

--
"It's 106 light-years to Chicago, we've got a full chamber of anti-
matter, a half a pack of cigarettes, it's dark, and we're wearing
visors."
"Engage."