Gzip output script works with Perl-CGI but gives error with Perl-FCGI

#1
Hi there,

I was trying to install the popular Perl-based forum YaBB on a LiteSpeed installation. It worked well when I was running Perl in CGI mode, but it started giving blank pages when I switched to FCGI using lsperld. After some debugging I found out that it's the gzip code that causes this, and I wrote a simple script to reproduce the error:

Code:
#!/usr/bin/perl --

$output = "Hello World.";

# Pipe the body through an external gzip; gzip inherits this
# script's stdout and writes the compressed bytes straight to it.
open(GZIP, "| gzip -f") or die "cannot start gzip: $!";
$| = 1;    # autoflush STDOUT so the headers go out before gzip's output

print "Content-Encoding: gzip\nContent-type: text/html\n\n";
print GZIP $output;
close(GZIP);    # EOF lets gzip flush its output and exit
exit;
This script works fine and outputs "Hello World." correctly in the browser in CGI mode. But after switching to FCGI, the page is blank and I can see "gzip: stdout: Bad file descriptor" in stderr.log.

Is there any way to rectify this?

-Regards,
Akash
 

mistwang

LiteSpeed Staff
#2
Not a Perl expert here, just a guess.
The test code may interfere with FCGI file handles 0 or 1, which breaks the FCGI protocol. It is not a problem with CGI, though.
And it is really a bad idea to start another process for the compression; use a built-in Perl module if one is available, or do not compress at all and let the web server do it.
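For example, something along these lines should compress entirely in-process (a minimal sketch, not tested under lsperld; IO::Compress::Gzip has shipped with the Perl core since 5.9.4, so no child process ever touches the FCGI-managed file descriptors):

Code:
#!/usr/bin/perl
use strict;
use warnings;
# Compress in memory instead of piping through an external gzip.
use IO::Compress::Gzip qw(gzip $GzipError);

my $output = "Hello World.";
my $compressed;
gzip(\$output => \$compressed)
    or die "gzip failed: $GzipError";

print "Content-Encoding: gzip\nContent-type: text/html\n\n";
print $compressed;

Because print goes through Perl's own STDOUT (tied by the FCGI library), this should work the same way under CGI and FCGI.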
 
#3
mistwang said:
> Not a Perl expert here, just a guess.
> The test code may interfere with FCGI file handles 0 or 1, which breaks the FCGI protocol. It is not a problem with CGI, though.
> And it is really a bad idea to start another process for the compression; use a built-in Perl module if one is available, or do not compress at all and let the web server do it.
I am not a Perl expert either; I simply picked the code from YaBB, which is a very old and very popular forum package written in Perl. My point was not about the quality of the code. I simply wanted to know why CGI can do this while FCGI cannot, and maybe your guess explains that.

But generally speaking, wouldn't this break any Perl script that forks a process which writes directly to stdout?
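For instance, a hypothetical sketch of what I mean (ls is just a stand-in command, not from YaBB): letting the child inherit stdout works under CGI but presumably not under FCGI, while capturing the output through a pipe and printing it from Perl should still work, since print then goes through the tied STDOUT:

Code:
# Breaks under FCGI: the child inherits fd 1, which is no longer
# a plain stdout, so its output never reaches the client.
system("ls -l /tmp");

# Should still work: the child's stdout goes into a pipe, Perl
# reads it, and print goes through the tied (FCGI-aware) STDOUT.
open(my $cmd, "-|", "ls", "-l", "/tmp") or die "cannot fork: $!";
print while <$cmd>;
close($cmd);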

-Regards,
Akash