Efficiency of a lot of variables [message #170856]
Sat, 04 December 2010 00:52
jwcarlton
I have 1,000 variables, written like this:
// Block 1
$hash['arr1']['var1'] = "whatever1";
$hash['arr1']['var2'] = "whatever2";
// ...
$hash['arr1']['var10'] = "whatever10";
// Block 2
$hash['arr2']['var1'] = "whateverelse1";
$hash['arr2']['var2'] = "whateverelse2";
// ...
$hash['arr2']['var10'] = "whateverelse10";
There are 100 blocks, and 10 variables in each block. The block used
is determined based on the domain used to access the site (there are
99 domains parked on top of the primary account). In the sample above,
'arr1' and 'arr2' represent the domains, and these are loaded on every
page of the site.
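In case it matters, here's roughly how the block gets picked; the hostnames and the $domainKeys map below are just placeholders, not my real values:
// Hypothetical map from hostname to the block key used in $hash.
$domainKeys = array(
    'example1.com' => 'arr1',
    'example2.com' => 'arr2',
    // ... one entry per parked domain
);
$host = strtolower($_SERVER['HTTP_HOST']);
$key  = isset($domainKeys[$host]) ? $domainKeys[$host] : 'arr1'; // fall back to the primary domain
// The 10 variables for the current domain, e.g. $vars['var1'].
$vars = $hash[$key];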
I've been researching this for a while, and most people seem to agree that it's
faster / more efficient to store all 1,000 variables in arrays than to use
100 blocks of if-else or switch-case. I had also considered having 100 text
files and just loading the appropriate one based on the domain (like below),
but most believed the file I/O would be slower than loading all 1,000 at once:
$domain = "whatever";
list($var1, $var2, ..., $var10) = file("/path/to/variables-$domain.txt", FILE_IGNORE_NEW_LINES);
So, I guess my first question is: do you guys agree that loading them all at
once into an array is faster / more efficient than the other three alternatives
(if-else, switch-case, or per-domain text files)? Or can you suggest another
option that I haven't considered?
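For what it's worth, I figure I can time each option with something like this (just a rough microtime() loop, not a proper benchmark):
// Time 100,000 lookups of whichever access pattern is being tested.
$start = microtime(true);
for ($i = 0; $i < 100000; $i++) {
    $value = $hash['arr1']['var1']; // swap in the alternative being compared
}
$elapsed = microtime(true) - $start;
printf("100,000 lookups: %.4f seconds\n", $elapsed);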
If this is the best method, my next question is about the structure of the
arrays. In the sample above, I have a single multidimensional array; is that
better than, for example, 100 separate arrays with 10 elements each:
$arr1['var1'] = "whatever1";
$arr1['var2'] = "whatever2";
// ...
$arr1['var10'] = "whatever10";
Or, vice versa, 10 arrays with 100 elements each (one per domain):
$var1['arr1'] = "whatever1";
$var2['arr1'] = "whatever2";
// ...
$var10['arr1'] = "whatever10";
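To make sure I'm comparing like with like, this is how I'd end up reading var1 for the current domain under each layout (assuming $key holds the domain's key, as in the placeholder sketch above):
// Layout 1: one multidimensional array, keyed by domain and then variable name.
$value = $hash[$key]['var1'];
// Layout 2: 100 separate arrays ($arr1 ... $arr100), one per domain.
// Reaching the right one needs a variable variable (or another if/switch).
$value = ${$key}['var1'];
// Layout 3: 10 arrays ($var1 ... $var10), each keyed by domain.
$value = $var1[$key];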
I know that I'm probably worrying too much about microseconds, but I
currently have an average of 600,000 pageviews a day, and expect it to
increase to at least 12,000,000 within the next year, so I'm trying to
make sure everything is as fast as possible before the problems set
in :-)
TIA,
Jason