LSCache expires earlier than the cache Public TTL setting!?!? help explain and understand [SOLVED]

AndreyPopov

Well-Known Member
#1
OpenLiteSpeed 1.7.14
Opencart 3.0.3.1
Journal Theme 3.1.8
LSCache plug-in 2.3.0
https://www.priazha-shop.com/

store contains 8200 products

for each product there are:
/index.php?route=product/product&product_id=41
/index.php?route=product/product&path=20_27&product_id=41
/index.php?route=product/product&manufacturer_id=8&product_id=41
Journal Theme adds:
/index.php?route=journal3/product&product_id=41&popup=quickview (QuickView popup)
/index.php?route=journal3/product&product_id=41&popup=options&product_quantity=1& (choose options price)

8200*5 = 41000 requests per UserAgent (desktop, mobile, or Safari browser views)

to increase the crawler speed, I disabled the crawler's delay by commenting out:
PHP:
 //usleep(round($diff));
//echo $diff . ' microseconds for one run' . PHP_EOL;
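for context, a minimal sketch of the kind of loop this delay lives in (structure simplified; this is not the plugin's exact code):
PHP:
<?php
// simplified sketch of the crawler loop; $urls stands in for the
// generated URL list
$urls = ['https://www.priazha-shop.com/index.php?route=product/product&product_id=41'];

foreach ($urls as $url) {
    $start = microtime(true);

    // warm the cache by requesting the page
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);

    $diff = (microtime(true) - $start) * 1000000; // elapsed microseconds

    // the delay commented out above: sleeping as long as the request
    // took roughly halves the crawl rate
    //usleep(round($diff));
}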

but on my hosting only around 600 requests per hour are completed.
41000/600 ≈ 68 hours!!! almost 3 days!!!!

a full recache of the site takes about 3 days per UserAgent.

I use an advanced crawler algorithm and crontab tasks to complete a full recache for each of:

Desktop View
Tablet View
Mobile View
Desktop View Safari (requires a separate cache because Safari does not support the webp image format)
Tablet View Safari (requires a separate cache because Safari does not support the webp image format)
Mobile View Safari (requires a separate cache because Safari does not support the webp image format)
Desktop Bot (requires a separate cache because bots do not accept cookies)
Mobile Bot (requires a separate cache because bots do not accept cookies)

8*3 = 24 days

in my conditions the real time is 18 days for a full recache of all UserAgents.

by default the Public Cache TTL in settings is 1200000 seconds (13.9 days)
I set the Public Cache TTL in the LSCache settings to 1814400 seconds (21 days)

BUT..............

I found that lscache expired after 7-10 days!!!! :(
and I must start the full recache again, and again, and again - an infinite process :(


in the docs I read:
https://openlitespeed.org/kb/litespeed-cache-on-openlitespeed-without-plugins/

Notes
OpenLiteSpeed outputs a response header X-LiteSpeed-Cache: hit if a request is served from public cache.

OpenLiteSpeed outputs a response header X-LiteSpeed-Cache: hit,private if a request is served from private cache.

The LSCache hit rate is calculated based on all files served. Many of the files served by LSWS, like CSS or HTML, are intentionally not cached by LSCache. Because these files are included in the LSCache hit rate calculation, the hit rate may sometimes look much lower than one might expect.

You will not see a Cache-Control:max-age header when you use LSCache. Please be aware there are two different concepts: Cache-Control:max-age is a browser cache header. It is not for server-side cache. x-litespeed-cache-control: public,max-age=86400 is an LSCache control header, which will be seen when the cache plugin is being used. When using a rewrite rule like [E=cache-control:max-age=120] to enable cache as instructed in this wiki, you won't see the x-litespeed-cache-control header.

The LSCache plug-in uses .htaccess rewrite rules to separate the caches.

if the page is not cached, the headers are:
x-litespeed-cache-control: public,max-age=1814400
x-litespeed-cache: miss

when the page is cached, the header is only:
x-litespeed-cache: hit
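
I check the headers from the CLI like this (the URL is just an example):
Code:
curl -s -o /dev/null -D - "https://www.priazha-shop.com/index.php?route=product/product&product_id=41" | grep -i x-litespeed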


please help me explain and understand WHAT rules must be in .htaccess so that lscache expires only after 1814400 secs (21 days)!!!!!!!!!
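
from the wiki note quoted above, the rewrite-rule form with my TTL would look like this (only the docs' max-age=120 example generalized to 21 days, not the plugin's actual rules):
Code:
RewriteEngine On
# enable public cache via rewrite rule with a server-side TTL of
# 1814400 seconds (21 days); per the wiki above, with this form no
# x-litespeed-cache-control header is visible
RewriteRule .* - [E=cache-control:max-age=1814400]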


============================================
SOLUTION:

- my hoster stored lscache in its own system space
- and deleted (cleaned) the lscache folder whenever it wanted (because of space overflow) - unexpected for me
- a hard and long discussion with the hoster's support and developers followed, pointing to the OLS LSCache docs on storing lscache in different folders
- the hoster enabled the possibility to store lscache in my account's file system
I'm happy :) for 10 days now lscache works after crawling
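
for reference, the cache storage location is the storagePath parameter of the cache module in the OLS virtual host config; a minimal sketch (the path is illustrative):
Code:
module cache {
  # move the cache out of the hoster's shared system space into the
  # account's own file system
  storagePath /home/myaccount/lscache/
}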
 

serpent_driver

Well-Known Member
#2
First of all: it is not really economical to try to warm up the cache for all URLs. Reduce the number to the most-wanted URLs if you don't have a method (crawler) that requests URLs in parallel instead of one by one like your current method does.
 

AndreyPopov

Well-Known Member
#3
First of all: it is not really economical to try to warm up the cache for all URLs.
really?!?!?!?! really?!?!?!?!

this does NOT solve the problem!

if you don't have a method (crawler) that requests URLs in parallel
I already made an advanced crawler to separately recache products, categories, manufacturers, and the catalog,
and I start the crawler by cron from the CLI (curl).
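for example (illustrative schedule; the endpoint is the one from the lscache-opencart README quoted in post #11):
Code:
# run the recache crawler every night at 03:00
0 3 * * * curl -N "http://yoursite/index.php?route=extension/module/lscache/recache&from=cli"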

but the problem is not there!

the problem: lscache expires earlier than even 10 days!!!!
 

serpent_driver

Well-Known Member
#4
Yes, really! ;) If no one ever requests an unwanted product in your shop, why should the cache of that URL be warmed up?! This is a waste of resources.

It is also not necessary to warm up the cache for bots. You don't gain any advantage.
 

AndreyPopov

Well-Known Member
#5
It is also not necessary to warm up the cache for bots. You don't gain any advantage.
really?

I have a GREEN zone in Google Search Console
(screenshot: gsc_green.jpg)

and 99% of URLs effective!!!
(screenshot: gsc_effective.jpg)

without recache there are ORANGE or RED zones!

but a recache for Google bots takes one day, max 2 days, for the whole site.

bots are not the problem!!!!!

the main problem is the Desktop, Mobile, and Apple UAs!!!




If no one ever requests an unwanted product in your shop, why should the cache of that URL be warmed up?! This is a waste of resources.
never wanted? this is a yarn site - today red is not wanted, tomorrow red is the most wanted!!!
today light summer yarn is not wanted, tomorrow the trend is light cotton hats!
 

serpent_driver

Well-Known Member
#6
Google doesn't measure the speed of your page with a bot. It uses client-side results from the Chrome browser, and that data is used to calculate the speed.

Really? Yes, really! ;)
 

AndreyPopov

Well-Known Member
#7
If no one ever requests an unwanted product in your shop, why should the cache of that URL be warmed up?! This is a waste of resources.
if you are sure about this - do as you like.

if I don't recache for Google bots, then there are orange and red zones in GSC.


none of your suggestions help solve the problem: how to stop OpenLiteSpeed's lscache from expiring earlier than the configured time!
 

AndreyPopov

Well-Known Member
#10
I don't tell you fairy tales.... Believe in whatever you want to believe in.... ;)
ok.

the fact is a fact: until I recache the site for Google bots, I see only orange and red columns in GSC.
and once again - the problem is not the recache for bots. a recache for bots takes one day, or 1.5-2 days at maximum.

the problem is the other UAs - 3-4 days for a full recache, and after 3 days I must start the recache again.
 

AndreyPopov

Well-Known Member
#11
Which method or crawler do you use to warm up the cache?
I use the lscache internal crawler:
https://github.com/litespeedtech/lscache-opencart


Cli Command for Rebuild All Cache
Run the following command at the website host:

curl -N "http://yoursite/index.php?route=extension/module/lscache/recache&from=cli"

but I add:
- &what=recache (for separate recache)
- and modify the crawler algorithm:
a) by default the crawler cannot recache more than 600-800 URLs per hour, so I made it remember the last recached position
b) the full URL list is stored in the DB, not in PHP memory, which prevents exceeding the PHP memory limit
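
roughly, a sketch of both modifications (the table and column names are hypothetical, not the plugin's actual schema):
PHP:
<?php
// sketch: URL list kept in the DB instead of PHP memory, plus a saved
// position so an interrupted run can resume (hypothetical schema)
$db = new PDO('mysql:host=localhost;dbname=opencart', 'user', 'password');

// position of the last successfully recached URL
$last = (int)$db->query('SELECT last_position FROM lscache_crawler_state')->fetchColumn();

// fetch the next batch from the DB; LIMIT matches ~600 requests per hour
$stmt = $db->prepare('SELECT id, url FROM lscache_crawler_urls WHERE id > ? ORDER BY id LIMIT 600');
$stmt->execute([$last]);

foreach ($stmt as $row) {
    // warm the cache for this URL
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_exec($ch);
    curl_close($ch);

    // remember the position after every URL so the next cron run resumes here
    $db->prepare('UPDATE lscache_crawler_state SET last_position = ?')->execute([$row['id']]);
}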
 

serpent_driver

Well-Known Member
#14
Reducing the number of URLs is general advice and isn't meant to solve your problem.

Can you find any different behaviour if a URL has more than 1 GET parameter versus only 1?
 

AndreyPopov

Well-Known Member
#17
1 GET parameter: &product_id=41
Multiple GET parameters: &path=20_27&product_id=41
1. does this answer the question WHY lscache expires earlier than what is set by the cache Public TTL setting?

2. different URLs - different cache entries!

just like /index.php?route=product/category&path=20&page=2 and /index.php?route=product/category&path=20&page=3
 

serpent_driver

Well-Known Member
#18
Different caches for different URLs is not what my question was about. You say you have issues with the cache lifetime, and the cache lifetime differs from what you defined, right?
 