Widget scheme

Rick Jelliffe rjelliffe at allette.com.au
Fri Nov 13 08:07:54 CET 2009


MURATA Makoto (FAMILY Given) wrote:
>  
> Second, we have to remember that the pack scheme is merely provisional.  
> There is a reason.  When it was first proposed, Roy Fielding and others at 
> the uri-review at ietf.org ML were very against. 
>
> http://www.ietf.org/mail-archive/web/uri-review/current/msg00545.html
> http://www.ietf.org/mail-archive/web/uri-review/current/msg00548.html
>   
I think Fielding is saying that we don't need a local handler for

    pack:http://www.eg.com/myfile.docx!part1.xml

which 1) splits the URL into its parts, 2) retrieves myfile.docx, and
then 3) looks inside it for part1.xml, when we could instead have

    http://www.eg.com/myfile/part1.xml

which 1) returns myfile.docx and invokes an OPC handler (like a cache),
then 2) looks inside that for ./part1.xml (or looks inside for a
catalog to resolve). On that view the pack scheme is irrelevant and
positively bad, because it forces the intermediate details to be
encoded in the URI.
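
To make that concrete, here is a rough sketch, in Java and purely for
illustration, of what such a local handler would have to do. It uses
the "!" separator from the example above rather than the official
pack: escaping rules, and the class and method names are invented.

    import java.io.InputStream;
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipInputStream;

    // Sketch of a local resolver for URIs of the form
    //   pack:http://www.eg.com/myfile.docx!part1.xml
    public class PackResolver {

        public static InputStream openPart(String packUri) throws Exception {
            // 1) split the URI into the container URL and the part name
            String rest = packUri.substring("pack:".length());
            int bang = rest.indexOf('!');
            String containerUrl = rest.substring(0, bang);
            String partName = rest.substring(bang + 1);

            // 2) retrieve the whole container (myfile.docx)
            HttpClient client = HttpClient.newHttpClient();
            HttpRequest request =
                HttpRequest.newBuilder(URI.create(containerUrl)).build();
            InputStream body = client.send(
                request, HttpResponse.BodyHandlers.ofInputStream()).body();

            // 3) look inside the ZIP-based package for the named part
            ZipInputStream zip = new ZipInputStream(body);
            for (ZipEntry e = zip.getNextEntry(); e != null; e = zip.getNextEntry()) {
                if (e.getName().equals(partName)) {
                    return zip;   // caller reads part1.xml from this stream
                }
            }
            throw new java.io.FileNotFoundException(
                partName + " not found in " + containerUrl);
        }
    }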

Fielding's alternative is plausible. However, the trouble I have with
it is that you need a mapping on the server side that says "when a
request is made for http://www.eg.com/myfile/* then return
http://www.eg.com/myfile.docx", which seems impractical: is an
information producer supposed to alter the URL rewriter every time
they add a new DOCX document to their website?  I suppose the idea is
that the server has some kind of URL rewriting rules built in, but I
don't know that web servers currently ship with the kinds of URL
rewriting capabilities needed (e.g. the Tuckey URL rewriter).  So
maybe I have the wrong end of the stick on this one, but I don't find
Fielding's suggestion very satisfactory for that reason: you need to
have specialist handlers attached to both the server and the client.
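
For comparison, here is a minimal sketch of the kind of server-side
mapping this seems to assume, done with the JDK's built-in HttpServer
rather than a URL rewriter: every request under /myfile/ is answered
by opening myfile.docx on disk and returning the named part. The port,
paths and file names are just the ones from the example, invented for
illustration.

    import com.sun.net.httpserver.HttpServer;
    import java.io.OutputStream;
    import java.net.InetSocketAddress;
    import java.util.zip.ZipEntry;
    import java.util.zip.ZipFile;

    public class PackageUnwrapServer {

        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

            // the mapping: /myfile/<part>  ->  look inside myfile.docx
            server.createContext("/myfile/", exchange -> {
                String partName = exchange.getRequestURI().getPath()
                                          .substring("/myfile/".length());
                try (ZipFile pkg = new ZipFile("myfile.docx")) {
                    ZipEntry part = pkg.getEntry(partName);
                    if (part == null) {
                        exchange.sendResponseHeaders(404, -1);
                        return;
                    }
                    // fall back to 0 (unknown length) if the entry size
                    // is not recorded in the ZIP directory
                    exchange.sendResponseHeaders(200, Math.max(part.getSize(), 0));
                    try (OutputStream out = exchange.getResponseBody()) {
                        pkg.getInputStream(part).transferTo(out);
                    }
                }
            });
            server.start();
        }
    }

And that mapping is per document: every new .docx published this way
needs its own context or rewrite rule, which is the part I find
impractical.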

One way to look at this is to ask whether Fielding's recommendation
has actually worked. Or has it effectively prevented packaging, in
favour of online resources in files or file systems?

In other words, if this solution was satisfactorily worked through
13 years ago, why are we still asking this question?

The web does have packages: web applications (e.g. .WAR files) and
perhaps MIME multipart email. And I do think it is entirely likely and
desirable that our file formats and applications will become like
little web sites/servlets, which is why I think preventing conflict at
the package level is useful. But no one downloads a .WAR file now;
they get installed behind the scenes. And MIME multipart email gets
read by mail readers: any URI syntax is just window-dressing.

So it looks a little to me as though Fielding's claim that the Web
architecture can handle packages is based on theoretical assertion
rather than being derived from reality. I am not saying that he didn't
derive the Web architecture from reality, just that this reality
didn't then, and doesn't now, include actual packages.  Of course, one
purpose of inducing the Web architecture is that its principles can
then guide future development.

Cheers
Rick Jelliffe

