 21 
 on: March 14, 2013, 11:18:08 am 
Started by nukpana - Last post by nukpana
http://googleblog.blogspot.com/2013/03/a-second-spring-of-cleaning.html

Not sure how many of you use Google Reader as an RSS reader or use its services, but this is a heads-up if you haven't heard.

 22 
 on: March 08, 2013, 10:09:30 pm 
Started by matthiasrawles - Last post by Keyrocks
Hi Pat ... welcome back!
Wish I had time to help you on this one, but I'm off in a hurry for now. Sounds strange though; I've never had this problem myself, so I probably wouldn't be able to offer any clues at this point. Hope someone else can.

 23 
 on: March 08, 2013, 06:31:49 pm 
Started by matthiasrawles - Last post by Patric Ahlqvist
Mmmm, I'm having the exact same problem... I haven't been at this for quite some time, and I'm being asked to retrieve/reset a password on an old site I did. So, as far as I remember how to go at it: I access the DB and change the user and password strings, setting the password to that MD5 hash stringy thingy up there, which should result in "test". And I'm so happy I remembered this... But alas, my 'appiness isn't long-lasting, as the login fails...

I would very much appreciate assistance. Bob, Doug, Fredde, anywhooo might have input here. Everything mentioned here has been checked, and so far nothing. Same host as I've always used, nothing wrong with the BOM, .htaccess, or anything else... bugger me, what the frack is this?

This site has been up and running, and everyone involved has been able to log on, but the password is now forgotten, changed by me to test/test, and nothing happens...
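For reference, a minimal PHP sketch of the check involved: the value stored in the DB's password field has to be exactly md5() of the plain-text password, with no stray whitespace hashed in. This is just the comparison, not sNews's actual login code:
Code: [Select]
<?php
// md5('test') must equal the stored hash exactly; if the DB value differs,
// a stray space, newline, or BOM probably got hashed in along the way.
var_dump(md5('test') === '098f6bcd4621d373cade4e832627b4f6'); // bool(true)
echo md5('test'); // 098f6bcd4621d373cade4e832627b4f6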

With sincere hopes of aid, and a wish for an 'appy weekend to everyone.

Patric.

 24 
 on: March 07, 2013, 01:25:18 pm 
Started by bobcat - Last post by lebohang
I would personally like to see the person who published the so-called hack come here and explain a) what they think the hack accomplishes, b) prove that the hack actually does something, and, if it really does something, c) provide insight into how it could be prevented. So far I've seen no proof. And I'm not holding my breath.

Same from me. In fact, I suspect the "exploit" report is fictitious... phoney... probably posted by someone who doesn't like sNews and wants to scare people away from using it. And we know there are a few developers out there in la-la-land who want to discredit sNews all they can for their own personal gain.  :o

Fitting name for the "exploit"... I tried the posted code, but it did not result in anything (sNews 1.7). No need to fear :)

Thanks...

 25 
 on: March 07, 2013, 08:41:23 am 
Started by sibas - Last post by sibas
Good question!

but truly I don't know.
Using memory_get_usage(true) or memory_get_peak_usage(true)
I get slightly higher values, but they are more stable; memory does not jump up and down so much.
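A small sketch of what the true argument changes, as I understand the manual: with real_usage = true PHP reports the memory it has reserved from the system (allocated in chunks, hence the steadier numbers), while false reports only what the script is actually using:
Code: [Select]
<?php
// real_usage = true : bytes PHP has reserved from the system; grows in
//                     chunks, so the figure moves in steps and looks stable.
// real_usage = false: bytes the script is actually using right now.
echo memory_get_usage(false), "\n";
echo memory_get_usage(true), "\n";
echo memory_get_peak_usage(true), "\n"; // high-water mark since startup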

BUT I found another solution  :D , it is "easy", fast, and works very smoothly according to my tests ;D

Here is my small solution for an "f.a.q." page that I want to display in multiple languages.

lang EN
Quote
$l['faqPage']='/lang/faq_page_EN.txt';

lang GR
Quote
$l['faqPage']='/lang/faq_page_GR.txt';

and so on...

Add this small function to snews.php (or to the page):
Code: [Select]
function loadFaq() {
	// Stream the language-specific FAQ text file straight to the output;
	// l('faqPage') returns the file path set in the active language file.
	readfile(dirname(__FILE__) . l('faqPage'));
}

create a new page and add
[func]loadFaq:|:[/func]

voila!!!!!

and according to PHP manual
Quote
readfile() will not present any memory issues, even when sending large files, on its own. If you encounter an out of memory error ensure that output buffering is off with ob_get_level().
   ;D ;D ;D


 26 
 on: March 06, 2013, 08:49:23 pm 
Started by sibas - Last post by Keyrocks
Interesting results. It appears memory usage isn't increasing incrementally by changing languages.
Are these numbers server memory use or page-loading memory used in the browser?

 27 
 on: March 05, 2013, 04:26:01 pm 
Started by sibas - Last post by sibas
Thanks dudes, I tried memory_get_usage()
and the results are something like this:

first page                 memory = 1217008
click on one article       memory = 1234784
reload the same article    memory = 1234992
change the language        memory = 1254952
go to one category         memory = 1230216
change language again      memory = 1213056
home again                 memory = 1218376
change language again      memory = 1238232
change language again      memory = 1219744

I will make a small script to log the memory on every page to get better results,
and if it stays like this I think I don't have any problem.
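Something along these lines should do for that script; a minimal sketch, assuming a simple tab-separated log is enough (the memory.log path is made up, adjust for your install):
Code: [Select]
<?php
// Append one line per request: time, URI, current and peak memory (bytes).
register_shutdown_function(function () {
	$uri  = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
	$line = sprintf("%s\t%s\t%d\t%d\n", date('c'), $uri,
		memory_get_usage(true), memory_get_peak_usage(true));
	file_put_contents(dirname(__FILE__) . '/memory.log', $line, FILE_APPEND);
});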

 28 
 on: March 05, 2013, 12:35:57 pm 
Started by sibas - Last post by nukpana
Quote from: sibas
Most of the data is plain text... What I'm trying to figure out is whether the memory management can handle such large lang files. In localhost tests I don't see problems and everything is working OK  :-\

I just wanted to make sure you weren't duplicating your database into language strings or loading data within the language strings as JSON. Personally, I would still remove the HTML bits from the language strings.

I wouldn't worry.  Only one file will be held in memory, not all of the files. You can always check by using memory_get_usage();

 29 
 on: March 03, 2013, 05:45:47 pm 
Started by sibas - Last post by Keyrocks
I don't see this as a big memory-management problem. Several content management applications written in PHP have many, many PHP files, not only for language variables (in several different languages) but for plug-ins or modules and core functionality.

PHP files are all loaded and read (parsed) on the host server, by Apache with PHP (on servers running Linux operating systems). We refer to this as server-side, and the PHP files - collectively - generate the source code sent from the server to a user's browser.

sNews - index.php loads first
The first file Apache (the Linux server's web-server app) looks for and loads is always the index file.
Apache's httpd.conf file (in my localhost installation) is configured to search for an index file in this order:
1.   index.php
2.   index.php4
3.   index.php3
4.   index.cgi
5.   index.pl
6.   index.html
7.   index.htm
8.   index.shtml
9.   index.phtml
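That order comes from httpd.conf's DirectoryIndex directive; the exact list varies per install, but it looks something like this:
Code: [Select]
DirectoryIndex index.php index.php4 index.php3 index.cgi index.pl index.html index.htm index.shtml index.phtml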

The sNews package uses the index.php file in its directory root, so Apache finds and loads it first.
We include the snews.php file (with all of the PHP functions) at the top of the index.php file, so all of the sNews site's PHP gets loaded before the rest of the index.php file gets parsed by the server.

Only one language-variable file - the one used by the site by default (as defined in the website's settings panel) - will be loaded and parsed (on the host server) when a user visits the website. If the user chooses a different language while visiting the site, the server will find that file and include it instead.
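In code, the idea amounts to something like this sketch (the file layout and the $_SESSION key are assumptions for illustration, not sNews's actual mechanism):
Code: [Select]
<?php
// Include only the visitor's chosen language file; the rest stay on disk.
$available = array('EN', 'GR', 'NL', 'FR', 'ES');
$lang = (isset($_SESSION['lang']) && in_array($_SESSION['lang'], $available))
	? $_SESSION['lang']
	: 'EN'; // the site default, as set in the settings panel
include dirname(__FILE__) . '/lang/lang_' . $lang . '.php'; // defines the $l[...] strings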

At least that's the way I understand it.  :)

 30 
 on: March 03, 2013, 05:10:20 pm 
Started by sibas - Last post by sibas
Hey nukpana
Most of the data is plain text, although there are a few parts that contain <p> <strong> <em> <br />

The biggest $l[''] part is 932 bytes and also has 5 <li></li> items.

Now, even if I remove all the HTML tags, the amount of that HTML data is negligible, so my question still stands.

Here is my problem: right now I use only two lang files, and I think their total size is going to be about 200000 bytes.

But what if I use, for example, 5 languages - EN GR NL FR ES - and each of them is 120000 bytes? The total is 600000 bytes...  ::)
Of course they are not all loaded at once, but only when someone switches languages.

What I'm trying to figure out is whether the memory management can handle such large lang files.
In localhost tests I don't see problems and everything is working OK  :-\
