Previously (http://fsanglier.blogspot.com/2008/06/alui-publisher-part-2-increase.html), I talked about (and, I hope, proved) why a "no redirect" mechanism for serving published content from Publisher is the best option for enabling portal caching. Publisher 6.4 already offers such a possibility (although it is not publicized much): in the portal's Published Content Web Service object, use published_content_noredirect.jsp instead of the standard published_content_redirect.jsp.
Unfortunately, once you start using this, you will see some weird behavior: the published content gets truncated in certain cases, and this is due to the way the JSP was coded. You have several options: wait for a Critical Fix (CF) to be issued by BEA (I am not aware of one yet), upgrade to ALUI 6.5 (I hear this has been fixed in 6.5, though I have not verified it), or simply fix it yourself, since it is a simple change to implement (and ultimately, I imagine a CF would contain the same type of code anyway).
By looking at the JSP within the Publisher web application archive (ptcs.war - explode the war with the jar command, e.g. jar -xvf ptcs.war), we can see what is wrong and why the content gets truncated in some cases:
HttpURLConnection conn = (HttpURLConnection)url.openConnection();
// make the request
conn.connect();
//read the content length
int contentLength = conn.getContentLength();
//if there is content, forward to the requesting client
if( contentLength > 0 ){
    // UTF-8 is necessary
    InputStreamReader isr = new InputStreamReader(conn.getInputStream(), "UTF-8");
    char[] content = new char[contentLength];
    isr.read(content);
    isr.close();
    out.write(content);
}
As you can see, an HTTP GET request is made, and the content length of the response is obtained from the "getContentLength()" method. This call takes the content length number from the response header rather than actually counting the bytes contained in the response body. That number can be wrong in several ways: it is -1 when the response uses chunked transfer encoding (in which case nothing is written at all), it counts bytes while the char array holds characters (and the two differ for multi-byte UTF-8 text), and a single read() call is not guaranteed to fill the buffer anyway. Since the code bases itself on this number to output the content to the JSP output stream (see above: a char array of length equal to contentLength), the content will indeed be truncated whenever that number does not match the actual content...
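To make one of these mismatches concrete, here is a small standalone sketch (mine, not from the JSP; the class name and sample string are just for illustration) showing that the byte count a Content-Length header reports differs from the char count a UTF-8 reader produces:

public class ContentLengthDemo {
    public static void main(String[] args) throws Exception {
        String body = "r\u00e9sum\u00e9"; // 6 chars, but each '\u00e9' takes 2 bytes in UTF-8
        byte[] utf8 = body.getBytes("UTF-8");
        // the Content-Length header would report the byte count...
        System.out.println("bytes (Content-Length): " + utf8.length);   // prints 8
        // ...but new char[contentLength] should be sized by the char count
        System.out.println("chars in the content:   " + body.length()); // prints 6
    }
}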
A simple correction (and more robust code) is to make sure ALL the content is pushed to the output stream, independently of the content length returned by the response header. Here is my code below, which fixes the issue and also improves performance by wrapping the bare InputStreamReader in the preferred BufferedReader class:
------EDITED 3/12/2009--------
BufferedReader bisr = null;
try {
    bisr = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
    String line;
    while ( (line = bisr.readLine()) != null ) {
        out.println(line);
    }
}
catch(Exception exc){
    throw exc; //to be caught by the global try catch
} finally {
    if(bisr != null)
        bisr.close();
    bisr = null;
}
return;
------END EDITED 3/12/2009--------
Basically, the code reads the entirety of the content, line by line down to the very last character, and writes it all to the output stream. This does not rely on the content length at all, and thus is more reliable and robust.
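One design note: readLine()/println() normalizes line endings and appends a trailing newline, which is usually harmless for published HTML. If you need to forward the content character-for-character, an equivalent variant of the same fix (just a sketch, reusing the conn and out variables already present in the JSP; the 2000-char buffer size is an arbitrary pick) copies fixed-size chunks of characters instead of lines:

BufferedReader bisr = null;
try {
    bisr = new BufferedReader(new InputStreamReader(conn.getInputStream(), "UTF-8"));
    char[] buffer = new char[2000];
    int charsRead;
    // keep reading until the stream itself reports end-of-content,
    // never trusting the Content-Length header
    while ( (charsRead = bisr.read(buffer)) != -1 ) {
        out.write(buffer, 0, charsRead);
    }
} finally {
    if (bisr != null)
        bisr.close();
}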
After changing published_content_noredirect.jsp as above, repackage ptcs.war with the corrected JSP (from the root of the previously extracted ptcs.war folder, run the jar -cvf ptcs.war * command) and redeploy it to ALL redirector and publisher instances...
Voila, you have your perfect solution for ALUI 6.1 and Publisher 6.4 (and previous versions too).