
Author Topic: What is the impact in memory of big language data?  (Read 174 times)

sibas

  • Sr. Member
  • ****
  • Karma: 23
  • Posts: 451
    • www.simply4all.net
What is the impact in memory of big language data?
« on: March 03, 2013, 09:46:37 AM »

I use a big amount of data in my language files, switching between them with the Multi-language simple Mod.

So far each language file is 53,600 bytes, and I expect it to double or triple by the end.

I don't know what the impact on memory is, or on sNews performance in general, when using such big language files.

Can someone enlighten me?
Has anyone used big language files before?

nukpana

  • Hero Member
  • *****
  • Karma: 71
  • Posts: 663
Re: What is the impact in memory of big language data?
« Reply #1 on: March 03, 2013, 04:26:00 PM »

What kind of data are we talking about here? Really, the language strings should be just that: strings. Any HTML/CSS/JS shouldn't be a part of this.

sibas

  • Sr. Member
  • ****
  • Karma: 23
  • Posts: 451
    • www.simply4all.net
Re: What is the impact in memory of big language data?
« Reply #2 on: March 03, 2013, 05:10:20 PM »

Hey nukpana,
Most of the data is plain text, although there are a few parts that contain <p> <strong> <em> <br />

The biggest $l[''] entry is 932 bytes and also has 5 <li></li> items.

Even if I remove all the HTML tags, the amount of that HTML data is negligible, so my question still stands.

Here is my problem: right now I use only two lang files, and I think their total size will end up around 200,000 bytes.

But what if I use, for example, 5 languages (EN, GR, NL, FR, ES) and each of them is 120,000 bytes? Then the total is 600,000 bytes…  ::)
Of course they are not all loaded at once, only when someone switches languages.

What I'm trying to figure out is whether the memory management of such large lang files can be a problem.
In localhost tests I don't see problems and everything works OK.  :-\

Keyrocks

  • Doug
  • ULTIMATE member
  • ******
  • Karma: 449
  • Posts: 6019
  • Semantically Challenged
    • snews.ca
Re: What is the impact in memory of big language data?
« Reply #3 on: March 03, 2013, 05:45:47 PM »

I don't see this as a big memory-management problem. Several content management applications written in PHP have many, many PHP files, not only for language variables (in several different languages) but also for plug-ins or modules and core functionality.

PHP files are all loaded and read (parsed) server-side, by the web server (typically Apache on hosts running Linux). We refer to this as server-side, and the PHP files, collectively, generate the source code sent from the server to a user's browser.

sNews - index.php loads first
The first file Apache (the Linux server's web-server application) looks for and loads is always the index file.
Apache's httpd.conf file (in my localhost installation) is configured to search for an index file in this order:
1.   index.php
2.   index.php4
3.   index.php3
4.   index.cgi
5.   index.pl
6.   index.html
7.   index.htm
8.   index.shtml
9.   index.phtml

The sNews package uses the index.php file in its directory root, so Apache finds and loads it first.
The snews.php file (with all of the PHP functions) is included at the top of index.php, so all of the sNews site's PHP gets loaded before the rest of index.php is parsed by the server.

Only one language-variable file — the one used by the site by default (as defined in the website's settings panel) — will be loaded and parsed on the host server when a user visits the website. If the user chooses a different language while visiting the site, then the server will find that language's file and include it instead.
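A minimal sketch of that per-request switch (the file names, the ?lang= parameter, and the session handling here are my own assumptions for illustration, not sNews's actual code):

```php
<?php
// Hypothetical sketch: load exactly one language file per request.
// File names and the GET/session handling are assumptions, not sNews code.
session_start();

$default = 'EN';
$allowed = array('EN', 'GR', 'NL', 'FR', 'ES');

// Remember a ?lang=XX switch for the rest of the visit.
if (isset($_GET['lang']) && in_array($_GET['lang'], $allowed, true)) {
    $_SESSION['lang'] = $_GET['lang'];
}
$lang = isset($_SESSION['lang']) ? $_SESSION['lang'] : $default;

// Only this single file gets parsed; the other language files cost nothing.
$file = dirname(__FILE__) . '/lang/lang_' . $lang . '.php';
if (is_file($file)) {
    include $file;
}
```

So even with five large language files on disk, only one of them occupies memory for any given request.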

At least that's the way I understand it.  :)
Do it now... later may not come.
-------------------------------------------------------------------------------------------------
sNews 1.6 MESU | sNews 1.6 MEMU

nukpana

  • Hero Member
  • *****
  • Karma: 71
  • Posts: 663
Re: What is the impact in memory of big language data?
« Reply #4 on: March 05, 2013, 12:35:57 PM »

Quote from: sibas on March 03, 2013, 05:10:20 PM

I just wanted to make sure you weren't duplicating your database into language strings, or loading data within the language strings as JSON. Personally, I would still remove the HTML bits from the language strings.

I wouldn't worry.  Only one file will be held in memory, not all of the files. You can always check by using memory_get_usage();
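A quick way to try that check — the array below is just a stand-in sized roughly like the ~120 KB language files discussed here, not real sNews data:

```php
<?php
// Measure roughly what a large language array costs in memory.
// memory_get_usage() reports bytes in use right now; the peak variant
// reports the high-water mark for the whole request.
$before = memory_get_usage();

$l = array();
for ($i = 0; $i < 1000; $i++) {
    $l['string_' . $i] = str_repeat('x', 120);  // 1000 strings of 120 bytes
}

$after = memory_get_usage();
echo 'language array: ~', ($after - $before), " bytes\n";
echo 'request peak:   ', memory_get_peak_usage(), " bytes\n";
```

Note that PHP's per-string and hash-table overhead makes the in-memory footprint noticeably larger than the raw byte count of the file.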

sibas

  • Sr. Member
  • ****
  • Karma: 23
  • Posts: 451
    • www.simply4all.net
Re: What is the impact in memory of big language data?
« Reply #5 on: March 05, 2013, 04:26:01 PM »

Thanks guys, I tried memory_get_usage()
and the results are something like this:

first page                  memory = 1217008
click on one article        memory = 1234784
reload the same article     memory = 1234992
change the language         memory = 1254952
go to one category          memory = 1230216
change language again       memory = 1213056
home again                  memory = 1218376
change language again       memory = 1238232
change language again       memory = 1219744

I will write a small script to log the memory on every page to get better results,
and if it stays like this I don't think I have any problem.

Keyrocks

  • Doug
  • ULTIMATE member
  • ******
  • Karma: 449
  • Posts: 6019
  • Semantically Challenged
    • snews.ca
Re: What is the impact in memory of big language data?
« Reply #6 on: March 06, 2013, 08:49:23 PM »

Interesting results. It appears memory usage isn't increasing incrementally as you change languages.
Are these numbers server memory use, or page-loading memory used in the browser?

sibas

  • Sr. Member
  • ****
  • Karma: 23
  • Posts: 451
    • www.simply4all.net
Re: What is the impact in memory of big language data?
« Reply #7 on: March 07, 2013, 08:41:23 AM »

Good question!

Honestly, I don't know.
Using memory_get_usage(true) or memory_get_peak_usage(true)
I get slightly higher values, but they are more stable; the memory doesn't jump up and down so much.

BUT I found another solution  :D — it is "easy", fast, and works very smoothly according to my tests ;D

Here is my small solution for an "f.a.q." page that I want to display in multiple languages.

lang EN
Quote
$l['faqPage']='/lang/faq_page_EN.txt';

lang GR
Quote
$l['faqPage']='/lang/faq_page_GR.txt';

and so on...

Add this small function to snews.php, or to the page:
Code: [Select]
// Stream the FAQ text file for the current language straight to the output.
function loadFaq() {
    readfile(dirname(__FILE__) . l('faqPage'));
}

create a new page and add
[func]loadFaq:|:[/func]

voilà!

and according to PHP manual
Quote
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
   ;D ;D ;D
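For reference, the guard the manual hints at could look like this — just a sketch of the idea with a hypothetical file path, not the exact sNews code:

```php
<?php
// Make sure no output buffer will accumulate the streamed file in memory,
// then let readfile() send it straight to the client chunk by chunk.
while (ob_get_level() > 0) {
    ob_end_flush();  // flush and disable each active buffering level
}

$file = dirname(__FILE__) . '/lang/faq_page_EN.txt';  // hypothetical path
if (is_file($file)) {
    readfile($file);  // streams the file without loading it into a string
}
```

With buffering off, readfile() never holds the whole file in PHP memory, which is why the numbers stay flat even for large text files.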

« Last Edit: March 07, 2013, 08:43:21 AM by sibas »