
Pages: 1 2 [3] 4 5 ... 10
 21 
 on: March 07, 2013, 08:41:23 am 
Started by sibas - Last post by sibas
Good question!

Honestly, I don't know. Using memory_get_usage(true) or memory_get_peak_usage(true)
I get slightly higher values, but they are more stable; memory does not jump up and down so much.

BUT I found another solution :D . It is "easy", fast, and works very smoothly according to my tests ;D

Here is my small solution for an "f.a.q" page that I want to display in multiple languages.

lang EN
Quote
$l['faqPage']='/lang/faq_page_EN.txt';

lang GR
Quote
$l['faqPage']='/lang/faq_page_GR.txt';

and so on...

Add this small function to snews.php, or to the page:
Code: [Select]
function loadFaq() {
	// Stream the current language's FAQ file straight to the output
	readfile(dirname(__FILE__).l('faqPage'));
}

Create a new page and add:
[func]loadFaq:|:[/func]

voila!!!!!

And according to the PHP manual:
Quote
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
   ;D ;D ;D
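The manual's advice above can be sketched as a small guard. This is illustrative only, not part of the Mod; the FAQ path below is an example, and the buffer-flushing loop is an assumption about how you might apply the ob_get_level() hint:

```php
<?php
// Sketch: make sure no output buffers are active before streaming
// a large file with readfile(), per the manual note quoted above.
while (ob_get_level() > 0) {
    ob_end_flush();   // flush and close each active buffer level
}
$faq = dirname(__FILE__).'/lang/faq_page_EN.txt';  // example path, not an sNews path
if (is_file($faq)) {
    readfile($faq);   // streams the file straight to output, not into memory
}
```

With buffering off, readfile() sends the file in chunks, so even a large FAQ file never sits fully in PHP's memory.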


 22 
 on: March 06, 2013, 08:49:23 pm 
Started by sibas - Last post by Keyrocks
Interesting results. It appears memory usage isn't increasing incrementally by changing languages.
Are these numbers server memory use or page-loading memory used in the browser?

 23 
 on: March 05, 2013, 04:26:01 pm 
Started by sibas - Last post by sibas
Thanks dudes, I tried memory_get_usage()
and the results are something like this:

first page                  memory = 1217008
click on one article        memory = 1234784
reload the same article     memory = 1234992
change the language         memory = 1254952
go to one category          memory = 1230216
change the language again   memory = 1213056
home again                  memory = 1218376
change the language again   memory = 1238232
change the language again   memory = 1219744

I will write a small script to log the memory on every page to get better results,
and if it stays like this I don't think I have any problem.
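The per-page logging idea above could be sketched like this. The log file location, the log format, and the 'cli' fallback are all assumptions for illustration, not part of sNews:

```php
<?php
// Sketch: append one line per request with the URI plus current
// and peak memory (bytes). The "true" argument reports memory
// allocated from the system, which gives steadier figures.
$logFile = dirname(__FILE__).'/memory.log';   // assumed, writable location
$uri = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
$line = date('Y-m-d H:i:s').' '.$uri.' '
      . memory_get_usage(true).' '
      . memory_get_peak_usage(true)."\n";
file_put_contents($logFile, $line, FILE_APPEND);
```

Tailing memory.log after browsing a few pages would show whether usage keeps climbing or just hovers around the same values, as in the numbers above.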

 24 
 on: March 05, 2013, 12:35:57 pm 
Started by sibas - Last post by nukpana
Quote from: sibas
Hey nukpana
Most of the data is plain text, although there are a few parts that contain <p> <strong> <em> <br />

The biggest $l[''] part is 932 bytes and also has 5 <li></li> items.

Even if I remove all the HTML tags, the amount of that HTML data is negligible,
so my question still stands.

Here is my problem: right now I use only two lang files, and I think the total size of the files will end up around 200000 bytes.

But what if I use, for example, 5 languages (EN GR NL FR ES) and each of them is 120000 bytes? Then the total is 600000 bytes…  ::)
Of course they are not all loaded, only when someone switches languages.

What I'm trying to figure out is whether the memory management of such large lang files is workable.
In localhost tests I don't see problems and everything is working OK  :-\

I just wanted to make sure you weren't duplicating your database into language strings, or loading data inside the language strings as JSON. Personally, I would still remove the HTML bits from the language strings.

I wouldn't worry.  Only one file will be held in memory, not all of the files. You can always check by using memory_get_usage();

 25 
 on: March 03, 2013, 05:45:47 pm 
Started by sibas - Last post by Keyrocks
I don't see this as a big memory-management problem. Several content management applications written in PHP have many, many PHP files, not only for language variables (in several different languages) but for plug-ins or modules and core functionality.

PHP files are all loaded and read (parsed) on the host server - by Apache, on servers running Linux operating systems. We refer to this as server-side, and the PHP files collectively generate the source code sent from the server to a user's browser.

sNews - index.php loads first
The first file Apache (the Linux server's server app) looks for and loads is always the index file.
Apache's httpd.conf file (in my localhost installation) is configured to search for an index file in this order:
1.   index.php
2.   index.php4
3.   index.php3
4.   index.cgi
5.   index.pl
6.   index.html
7.   index.htm
8.   index.shtml
9.   index.phtml
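In httpd.conf that search order comes from a DirectoryIndex directive, along these lines (an illustrative directive matching the order above, not necessarily any install's exact default):

```apache
DirectoryIndex index.php index.php4 index.php3 index.cgi index.pl index.html index.htm index.shtml index.phtml
```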

The sNews package uses the index.php file in its directory root, so Apache finds and loads it first.
We include the snews.php file (with all of the PHP functions) at the top of the index.php file, so all of the sNews site's PHP gets loaded before the rest of the index.php file gets parsed by the server.

Only one language variable file - the one used by the site by default (as defined in the website's settings panel) - will be loaded and parsed (on the host server) when a user visits the website. If the user chooses a different language while visiting the site, the server (Apache) will find that file and include it instead.

At least that's the way I understand it.  :)
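A minimal sketch of that "only one language file per request" behaviour, assuming a lang/ directory and a whitelist of language codes (both assumptions for illustration - this is not sNews's actual internal code):

```php
<?php
// Sketch: resolve exactly one language file per request.
// The whitelist and the lang/XX.php naming are assumptions.
function langFilePath($requested) {
    $allowed = array('EN', 'GR', 'NL', 'FR', 'ES');
    // Fall back to the site default if the request isn't whitelisted;
    // this also blocks path tricks like "../../etc/passwd".
    $lang = in_array($requested, $allowed) ? $requested : 'EN';
    return dirname(__FILE__).'/lang/'.$lang.'.php';
}
// include(langFilePath($_GET['lang'])); // only this one file gets parsed
```

However many language files sit on disk, each request includes exactly one of them, so total on-disk size doesn't affect per-request memory.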

 26 
 on: March 03, 2013, 05:10:20 pm 
Started by sibas - Last post by sibas
Hey nukpana
Most of the data is plain text, although there are a few parts that contain <p> <strong> <em> <br />

The biggest $l[''] part is 932 bytes and also has 5 <li></li> items.

Even if I remove all the HTML tags, the amount of that HTML data is negligible,
so my question still stands.

Here is my problem: right now I use only two lang files, and I think the total size of the files will end up around 200000 bytes.

But what if I use, for example, 5 languages (EN GR NL FR ES) and each of them is 120000 bytes? Then the total is 600000 bytes…  ::)
Of course they are not all loaded, only when someone switches languages.

What I'm trying to figure out is whether the memory management of such large lang files is workable.
In localhost tests I don't see problems and everything is working OK  :-\

 27 
 on: March 03, 2013, 04:26:00 pm 
Started by sibas - Last post by nukpana
What kind of data are we talking about here? Really, the language strings should be just that - strings - and any HTML/CSS/JS really shouldn't be part of them.

 28 
 on: March 03, 2013, 09:46:37 am 
Started by sibas - Last post by sibas
I use a large amount of data in my language files, switching them with the Multi-language simple Mod.

So far each language file is 53600 bytes, and I expect it to double or triple by the end.

I don't know what the impact on memory is, or on sNews performance in general, of using such big language files.

Can someone enlighten me?
Has anyone used big language files before?

 29 
 on: March 03, 2013, 12:22:43 am 
Started by marengo - Last post by marengo
Quote
No problem finding the admin login panel for your site.

Oh, yes, thank you, I tried a wrong URL so  :o
http://frerealbert.be/login/ works just fine

Sorry for the confusion and thank you for the help

Regards,
Nathalie

 30 
 on: March 02, 2013, 08:57:06 pm 
Started by marengo - Last post by Fred K
Quote from: marengo
admin page cannot be found http://frerealbert.be/admin

The URL is wrong: in sNews 1.7 the admin area "home" URL is http://domain.com/administration/ - so the question is why the wrong URL? Was it manually constructed? Anyway, if you're using sNews 1.7, the incorrect URL would be the reason the admin page cannot be found.
