I'm so confused...

Lee

Well-Known Member
#1
Let me start by making it clear: I DO NOT know anything about cache and how it works. I just want to warm the cache so my customers have the best experience they can, and I do not want to spend any more time warming the cache manually. I'm tired...

I've had a hard time getting the cache to work the way I 'think' it should work. I have to warm up the cache manually by going through all the menu items on both the iPhone and the desktop!
  1. I delete all the files in the lscache directory
  2. I run this CLI command: curl -N "https://fragnanimous.com/index.php?route=extension/module/lscache/recache&from=cli"
  3. I look in the directory and there is a growing number of entries, so it appears to be warming the cache
  4. I take one of the URLs that was cached via the CLI (I've tried many URLs, and none get cache hits)
  5. I paste the URL into the browser on my Windows 10 desktop and it shows a cache MISS
  6. I refresh the page and it shows a HIT
It's the same on my iPhone. Running the CLI command does not warm up the cache, period; it never has.
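
(For reference, a quick way to check a hit or miss without a browser is something like the sketch below; it assumes the server sends the X-LiteSpeed-Cache response header, and the URL and user agent are only examples.)

Code:
#!/bin/bash
# Check whether a URL is served from LiteSpeed's cache by looking at the
# response headers. A cached response normally carries "X-LiteSpeed-Cache: hit";
# a missing header (or "miss") means the page was generated dynamically.
# URL and user agent below are examples only.
URL="https://fragnanimous.com/"
UA="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36"

curl -sS -I -A "$UA" "$URL" | grep -i "x-litespeed-cache"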

Why didn't the CLI command create a cached copy of that URL?

LiteSpeed ESI Feature: Disabled
Separate View for Logged-in Users: Disabled
Separate View for Mobile Device: Tried Enabled and Disabled
Separate View for Safari Browser: Tried Enabled and Disabled

I tried this .htaccess file:
<IfModule LiteSpeed>
CacheEngine on crawler
RewriteEngine On
RewriteCond %{REQUEST_METHOD} ^HEAD|GET$
RewriteCond %{HTTP_USER_AGENT} "iPhone|iPod|Petal|BlackBerry|Palm|Googlebot-Mobile|Mobile|mobile|mobi|Windows Mobile|Safari Mobile|Android|Opera Mini|Fennec" [NC]
RewriteRule .* - [E=Cache-Control:vary=isMobile]

RewriteCond %{HTTP_USER_AGENT} "bot|compatible|images|cfnetwork|favicon|facebook|crawler|spider|addthis" [NC]
RewriteCond %{HTTP_USER_AGENT} !Chrome [NC]
RewriteCond %{HTTP_USER_AGENT} !Mobile [NC]
RewriteCond %{HTTP_USER_AGENT} !Macintosh [NC]
RewriteRule .* - [E=Cache-Control:vary=isBot]


RewriteCond %{HTTP_USER_AGENT} Bot [NC]
RewriteCond %{HTTP_USER_AGENT} Android [NC]
RewriteCond %{HTTP_USER_AGENT} Chrome [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobilebot]


RewriteCond %{HTTP_USER_AGENT} Macintosh [NC]
RewriteRule .* - [E=Cache-Control:vary=isMac]


RewriteCond %{HTTP_USER_AGENT} "iPhone|iPad|Petal" [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobileapple]


RewriteCond %{HTTP_USER_AGENT} Android [NC]
RewriteCond %{HTTP_USER_AGENT} "Chrome|Firefox|Opera|OPR" [NC]
RewriteCond %{HTTP_USER_AGENT} !Bot [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobile]
### marker WEBP start ###
RewriteCond %{HTTP_ACCEPT} "image/webp" [or]
RewriteCond %{HTTP_USER_AGENT} "Page Speed"
RewriteRule .* - [E=Cache-Control:vary=%{ENV:LSCACHE_VARY_VALUE}+webp]
RewriteCond %{HTTP_USER_AGENT} iPhone.*Version/(\d{2}).*Safari
RewriteCond %1 >13
RewriteRule .* - [E=Cache-Control:vary=%{ENV:LSCACHE_VARY_VALUE}+webp]
### marker WEBP end ###
</IfModule>


And I tried this basic .htaccess file too:
CacheEngine on crawler
RewriteEngine On


RewriteCond %{REQUEST_METHOD} ^HEAD|GET$

PHP Version 7.4
Opencart 3.0.3.7
Journal 3
Shared Server

I've tried every combination of settings and I never get a hit on the first page load.

I really need help.
 

serpent_driver

Well-Known Member
#2
I DO NOT know anything about cache and how it works
You don't have to despair. One cache is not the same as another, because there are different types of cache. The cache that LiteSpeed uses can be explained very simply: save a page on your computer with your browser's save function, then open the saved file, and you already have a cache. LiteSpeed basically does the same thing. The files in the /lscache directory correspond to the files you would save on your computer with your browser. It's more complex in detail, but this simple example should give you the basic principle.

The reason why the pages requested by the crawler script don't end up cached for you is simple as well, and can be explained just as easily.

Without a cache, the content of each requested URL is generated dynamically. If a page is cached, that dynamic generation no longer happens, because each page is served from a statically stored file; there is no more PHP and no more database queries. This is the main reason a cached page loads much faster. That has advantages, but also disadvantages, especially in an application like an online shop. The contents of an online shop are not only very dynamic, they are often user-specific, and the shopping cart is the best example of this. The challenge in making a cache like LiteSpeed usable in an online shop is to compensate for that lost dynamism; otherwise you could not use a page cache in a shop at all.

The special thing about the LiteSpeed cache is that it does not create just one version of a cached URL but, if necessary, any number of versions. For example, if the content of a requested URL changes because the user performs an action that sets a cookie, LiteSpeed creates a new version of that cached URL. The LiteSpeed cache is also URL based: if the URL of a requested page changes, e.g. by a $_GET parameter, LiteSpeed creates a new version for that URL. No setting is required for this; the cache does it automatically. The situation is different with cookies and with different devices, i.e. mobile devices. Journal 3 has device detection and generates different code depending on the device, so you have to configure the LiteSpeed cache accordingly via .htaccess rules, which you have already done. These so-called cache varies define a separate cache version for each type of device. The detection does not happen on the device itself but via the so-called user agent: with this string, the browser tells the server whether it is a smartphone, a tablet or a desktop computer.

The problem is that the crawler script also uses a user agent to warm up the cache, and that user agent is not covered by your cache varies. The simulated requests of the crawler do create cached copies, but only for the crawler's own vary and not for normal users, because their user agents are completely different. That is why the crawler warmup does not create a cache for normal visitors. This is not a bug or a malfunction; the crawler simply isn't designed to be used with Journal 3. The same applies to the OpenCart functions "Separate View for Mobile Device" and "Separate View for Safari Browser". If you weren't using Journal 3, you wouldn't even notice the problem.
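
To make that concrete, here is a minimal sketch (the URL and the warmup user agent are placeholders, not the plugin's real values): the first request primes a cache copy only for the vary that matches its user agent, so a request under a different vary still starts with a MISS.

Code:
#!/bin/bash
# Illustration of the vary mismatch described above.
URL="https://fragnanimous.com/index.php?route=common/home"

# 1. A warmup request with a generic user agent (placeholder, not the plugin's
#    real one) primes a cache copy for the default, desktop-style vary only.
curl -sS -o /dev/null -A "example-warmup-agent" "$URL"

# 2. An iPhone Safari request falls under a different cache vary
#    (e.g. ismobileapple in the rules above), so its first visit is still
#    generated dynamically: no "X-LiteSpeed-Cache: hit" header yet.
curl -sS -I -A "Mozilla/5.0 (iPhone; CPU iPhone OS 15_0 like Mac OS X) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Mobile/15E148 Safari/604.1" "$URL" | grep -i "x-litespeed-cache"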
 

Lee

Well-Known Member
#3
Thank you for that answer, I really appreciate it.

So there's no way to get the cache warmed up and usable with Journal 3, what a bummer...

I guess I still have to do it manually :(
 

Lee

Well-Known Member
#6
OK, I went to a site that gave me my user agent for my desktop computer:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36

I put this in 'Rebuild Cache for specific devices/browsers', saved it, emptied the cache, then rebuilt the cache, and it STILL does not give me a hit when I visit any page on my site. Only visiting a page manually will give me a cached hit.

It acts like there's a disconnect between my site (Journal 3) and LiteSpeed Cache. They obviously don't know the other exists!
 

serpent_driver

Well-Known Member
#7
I put this in 'Rebuild Cache for specific devices/browsers', saved it, emptied the cache, then rebuilt the cache, and it STILL does not give me a hit when I visit any page on my site. Only visiting a page manually will give me a cached hit.
Why should that work? Entering a specific UA doesn't change the UA of the crawler; the crawler's "name" is still the same. It could work, though, if you remove all custom cache varies from .htaccess.
 

Lee

Well-Known Member
#8
Because it should then cache for my exact browser? Just experimenting!
BTW, why would LiteSpeed have 'Rebuild Cache for Specific Browsers' if it wouldn't work for me? Am I missing the point of that field?

I have come across a temporary fix that works well for the time being: the Octoparse web crawler! I put in my UA, parse every link from my sitemap, and voila! It warms the cache for Chrome, Edge and Firefox on my computer.
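
(In plain curl terms, what that crawl is doing is roughly the sketch below; the sitemap path, user agent and timing are just examples, adjust for your own shop.)

Code:
#!/bin/bash
# Pull every <loc> URL out of the sitemap and request each one once with a
# chosen user agent, so LiteSpeed stores a copy for that vary.
# Note: grep -oP needs GNU grep with PCRE support.
SITEMAP="https://fragnanimous.com/sitemap.xml"
UA="Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/104.0.0.0 Safari/537.36"

curl -sS "$SITEMAP" | grep -oP '(?<=<loc>)[^<]+' | while read -r url; do
    curl -sS -o /dev/null -A "$UA" "$url"
    sleep 1   # go easy on a shared server
done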

Next I'm going to create a crawl with the iPhone UA. But it's late so it will be in the morning.

Looking forward to your program being available so I can put all this nonsense to rest.
 

AndreyPopov

Well-Known Member
#11
I really need help.
first - remove:
Code:
### marker WEBP start ###
RewriteCond %{HTTP_ACCEPT} "image/webp" [or]
RewriteCond %{HTTP_USER_AGENT} "Page Speed"
RewriteRule .* - [E=Cache-Control:vary=%{ENV:LSCACHE_VARY_VALUE}+webp]
RewriteCond %{HTTP_USER_AGENT} iPhone.*Version/(\d{2}).*Safari
RewriteCond %1 >13
RewriteRule .* - [E=Cache-Control:vary=%{ENV:LSCACHE_VARY_VALUE}+webp]
### marker WEBP end ###
from .htaccess

I hope you enabled webp support in the Journal settings?
Because Journal has its own algorithm to detect UAs and decide which images to serve.



second - remove:
Code:
RewriteCond %{REQUEST_METHOD} ^HEAD|GET$
RewriteCond %{HTTP_USER_AGENT} "iPhone|iPod|Petal|BlackBerry|Palm|Googlebot-Mobile|Mobile|mobile|mobi|Windows Mobile|Safari Mobile|Android|Opera Mini|Fennec" [NC]
RewriteRule .* - [E=Cache-Control:vary=isMobile]
these are NOT needed!!!!

third - remove:
Code:
CacheEngine on crawler
and add
Code:
CacheLookup on

fourth - I hope you entered the appropriate UAs in the LSCache GUI field "Rebuild Cache for specific devices/browsers" before starting the crawler?
Code:
Mozilla/5.0 (iPhone; CPU iPhone OS 12_2_1 like Mac OS X) AppleWebKit/604.4.7 (KHTML, like Gecko) Version/11.0 Mobile/15C153 Safari/604.1
you need to build (warm up) the cache for each UA!!!!

Code:
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.43 Safari/537.36
Mozilla/5.0 (iPhone; CPU iPhone OS 12_2_1 like Mac OS X) AppleWebKit/604.4.7 (KHTML, like Gecko) Version/11.0 Mobile/15C153 Safari/604.1
Mozilla/5.0 (Linux; Android 7.1.2; SM-G988N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.98 Mobile Safari/537.36
Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Safari/605.1.15
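
If you prefer the command line, a rough sketch of the same idea is to repeat the crawl once per user agent listed above (the URL is only an example, and pages that also vary on a cookie, as discussed later in this thread, may need that cookie sent as well).

Code:
#!/bin/bash
# Warm one example URL once for every user agent listed above, so each
# cache vary (desktop, iPhone, Android, Mac) gets its own stored copy.
URL="https://fragnanimous.com/index.php?route=common/home"
UAS=(
  "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.43 Safari/537.36"
  "Mozilla/5.0 (iPhone; CPU iPhone OS 12_2_1 like Mac OS X) AppleWebKit/604.4.7 (KHTML, like Gecko) Version/11.0 Mobile/15C153 Safari/604.1"
  "Mozilla/5.0 (Linux; Android 7.1.2; SM-G988N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/102.0.5005.98 Mobile Safari/537.36"
  "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/605.1.15 (KHTML, like Gecko) Version/15.0 Safari/605.1.15"
)

for ua in "${UAS[@]}"; do
    curl -sS -o /dev/null -A "$ua" "$URL"
done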



P.S. I'll add more comments later, after reading your other posts in this thread.
Right now I'm very busy translating a site into Ukrainian.
 

Lee

Well-Known Member
#12
Thank you Andrey, I will make these changes and report back.

After a year of trying this and trying that, the file has all kinds of entries in it now and I have no idea what works and what doesn't!
 

AndreyPopov

Well-Known Member
#13
Thank you Andrey, I will make these changes and report back.

After a year of trying this and trying that, the file has all kinds of entries in it now and I have no idea what works and what doesn't!
REMEMBER: ALL LSCache rewrite rules must be placed in .htaccess BEFORE any other rules with the [L] flag.

I have a perfectly working configuration ;)

I hope you use the latest LSCache plugin, especially with this commit for the crawler and the curl magic option ;)
 

AndreyPopov

Well-Known Member
#16
Also, how do I stop curl from crashing every few minutes when I add two or more UAs in Rebuild Cache?
that is the PHP memory limit being exceeded :(

I solved it with my Advanced Crawler mode.


Please send me your perfect configuration! I'm tired of messing around...
now I use this:


Code:
### LITESPEED_CACHE_START - Do not remove this line
<IfModule LiteSpeed>
CacheLookup on
## Uncomment the following directives if you have a separate mobile view
RewriteEngine On
## Uncomment the following directives if you have a separate Safari browser view
RewriteCond %{HTTP_USER_AGENT} "bot|compatible|images|cfnetwork|favicon|facebook|crawler|spider|addthis" [NC]
RewriteCond %{HTTP_USER_AGENT} !Chrome [NC]
RewriteCond %{HTTP_USER_AGENT} !Mobile [NC]
RewriteCond %{HTTP_USER_AGENT} !Macintosh [NC]
RewriteRule .* - [E=Cache-Control:vary=isBot]
RewriteCond %{HTTP_USER_AGENT} Bot [NC]
RewriteCond %{HTTP_USER_AGENT} Android [NC]
RewriteCond %{HTTP_USER_AGENT} Chrome [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobilebot]
RewriteCond %{HTTP_USER_AGENT} Macintosh [NC]
RewriteCond %{HTTP_USER_AGENT} !"Chrome|Firefox|Opera|OPR" [NC]
RewriteRule .* - [E=Cache-Control:vary=isMac]
RewriteCond %{HTTP_USER_AGENT} "iPhone|iPad|Petal" [NC]
RewriteCond %{HTTP_USER_AGENT} !"Chrome|Firefox|Opera|OPR|CriOS|FxiOS|OPT" [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobileapple]
RewriteCond %{HTTP_USER_AGENT} "Android|iPhone|iPad" [NC]
RewriteCond %{HTTP_USER_AGENT} "Chrome|Firefox|Opera|OPR|CriOS|FxiOS|OPT" [NC]
RewriteCond %{HTTP_USER_AGENT} !Bot [NC]
RewriteRule .* - [E=Cache-Control:vary=ismobile]
</IfModule>
### LITESPEED_CACHE_END
 

Lee

Well-Known Member
#18
OK, I used your .htaccess rules, your program and your UAs, and it does warm the cache for my Windows desktop, but it does not warm the cache for my iPhone :(

It's the same problem I've had from day one. It will never warm the cache for my phone, so I have to assume it doesn't warm it for any iPhone.

Now, if I use my iPhone and go to a page, that page is cached from then on. So LiteSpeed is caching it; I just cannot warm it up.
 

AndreyPopov

Well-Known Member
#19
OK, I used your .htaccess rules, your program and your UAs, and it does warm the cache for my Windows desktop, but it does not warm the cache for my iPhone :(
please check whether your LSCache plugin contains the commit "Recache now set cache copy cookies according to customized useragents"


because the main lscache algorithm for iPhone requires both:
$SERVER('LS_CACHE_CTRL') vary=ismobileapple
and
$COOKIE('_lscache_vary') device:mobile

but without the commit above, the crawler does NOT set that cookie during recache.
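
In curl terms, a warmup request for the iPhone copy would have to carry both pieces. A rough sketch (the URL is an example, and the cookie value should match what the plugin actually sets):

Code:
#!/bin/bash
# Request that matches the iPhone cache vary on both counts: the user agent
# (so vary=ismobileapple applies) and the _lscache_vary cookie
# (so the mobile copy is the one that gets stored and checked).
URL="https://fragnanimous.com/index.php?route=common/home"
UA="Mozilla/5.0 (iPhone; CPU iPhone OS 12_2_1 like Mac OS X) AppleWebKit/604.4.7 (KHTML, like Gecko) Version/11.0 Mobile/15C153 Safari/604.1"

curl -sS -I -A "$UA" -b "_lscache_vary=device:mobile" "$URL" | grep -i "x-litespeed-cache"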
 