Defense against the dark arts? (Cross-site scripting and cross-site request forgery)

I was having a discussion with someone on IRC about how TWiki is vulnerable to cross-site scripting and cross-site request forgery, and we realized that there are two possible approaches to securing TWiki effectively (both requiring a unique magic number in all URLs):

   1. add a pre-process step to the TWiki::UI system, requiring a valid and unique magic, and a post-process step between rendering and output to the browser
   2. use a small proxy system between TWiki and browsers to add and validate the magic

Approach 1 is actually still risky, as all scripts are still able to output directly to the browser using a =print= statement, handing the user URLs that lack the necessary magic; the same goes for AddOns that persist in not using REST handlers.
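As a rough sketch of what the pre- and post-process steps of approach 1 might look like (everything here – the function names, the =magic= parameter, the =/bin/= path layout – is an illustrative assumption, not the real TWiki::UI API): derive a per-session magic as an HMAC of the session id under a server-side secret, refuse any request that does not carry it back, and stamp it onto local links before output.

<verbatim>
use strict;
use warnings;
use Digest::SHA qw(hmac_sha256_hex);

my $secret = 'per-installation-secret';    # assumption: known only to the server

# Derive the per-session magic from the session id
sub magic_for {
    my ($session_id) = @_;
    return hmac_sha256_hex( $session_id, $secret );
}

# Hypothetical pre-process step, run before TWiki::UI dispatches an action:
# refuse the request unless it carries the session's magic back to us
sub assert_magic {
    my ( $session_id, $query ) = @_;    # $query: a CGI-like object
    my $given = $query->param('magic') || '';
    die "invalid or missing magic\n" unless $given eq magic_for($session_id);
}

# Hypothetical post-process step, between rendering and output: stamp the
# magic onto every local URL so the links we hand out remain usable
sub add_magic_to_links {
    my ( $session_id, $html ) = @_;
    my $magic = magic_for($session_id);
    $html =~ s{(href="/bin/[^"]+)}{
        my $u = $1;    # copy $1 before the nested match below resets it
        $u . ( $u =~ /\?/ ? ';' : '?' ) . "magic=$magic"
    }ge;
    return $html;
}
</verbatim>

Because the magic is derived from the session rather than stored, the check costs one HMAC per request – but note the point above: none of this helps for scripts that =print= their own output and so bypass the post-process step.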

Approach 2, on the other hand, abstracts the security away from the application server, in much the same way as SSL is often terminated at a proxy – goodness all round.

So – I wonder if there is such a proxy already?
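There is at least the HTTP::Proxy module on CPAN, which supports pluggable request and response filters. Below is a minimal sketch of approach 2 built on it – the =$secret=, the per-session handling (faked here with a single static magic), the list of protected scripts, and the link-rewriting regex are all assumptions for illustration:

<verbatim>
#!/usr/bin/perl
use strict;
use warnings;
use HTTP::Proxy;
use HTTP::Proxy::HeaderFilter::simple;
use HTTP::Proxy::BodyFilter::simple;
use HTTP::Response;
use Digest::SHA qw(hmac_sha256_hex);

my $secret = 'change-me';                               # assumed per-installation secret
my $magic  = hmac_sha256_hex( 'session-id', $secret );  # a real proxy would derive this per session

# Assumed set of state-changing TWiki scripts that must carry the magic
my %protected = map { $_ => 1 } qw(save attach upload rename manage);

my $proxy = HTTP::Proxy->new( port => 8080 );

# Request side: answer 403 to state-changing requests that lack the magic
$proxy->push_filter(
    request => HTTP::Proxy::HeaderFilter::simple->new(
        sub {
            my ( $self, $headers, $request ) = @_;
            my ($script) = $request->uri->path =~ m{/bin/(\w+)};
            return unless $script && $protected{$script};
            my %param = $request->uri->query_form;
            unless ( defined $param{magic} && $param{magic} eq $magic ) {
                $self->proxy->response(
                    HTTP::Response->new(
                        403, 'Forbidden', undef, "Missing or invalid magic\n"
                    )
                );
            }
        }
    ),
);

# Response side: stamp the magic onto local links in outgoing HTML
$proxy->push_filter(
    mime     => 'text/html',
    response => HTTP::Proxy::BodyFilter::simple->new(
        sub {
            my ( $self, $dataref ) = @_;
            $$dataref =~ s{(href="/bin/[^"]+)}{
                my $u = $1;    # copy $1 before the nested match resets it
                $u . ( $u =~ /\?/ ? ';' : '?' ) . "magic=$magic"
            }ge;
        }
    ),
);

$proxy->start;
</verbatim>

This glosses over plenty – POST bodies, links split across body-filter chunks, forms, real per-session magics – but it shows the shape: TWiki itself never changes, and the wrapper carries all the CSRF logic.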

There are also massive performance reasons why you should always have a proxy between browsers and heavy application servers like TWiki – this too could do with filling out.

Securing TWiki is not as simple as converting all actions to POST (i.e. using proper REST / HTTP), because there are too many legacy conveniences that allow GET URLs to act upon the data. But by delegating the securing of the transactions to an external wrapper, I think we can avoid these flaws.
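To make the legacy-GET problem concrete: a URL along the lines of /bin/save/Web/Topic?text=... changes data on a plain GET, so the wrapper has to classify requests by which script they hit rather than by HTTP method. Something like the following, where the script list and the =/bin/= path layout are assumptions (both vary by installation):

<verbatim>
# Method alone cannot decide safety: legacy TWiki URLs act on data via GET,
# so classify by script name instead, regardless of GET vs POST
sub needs_magic {
    my ($request) = @_;    # an HTTP::Request
    my ($script)  = $request->uri->path =~ m{/bin/(\w+)};
    my %state_changing = map { $_ => 1 } qw(save attach upload rename manage);
    return $script && $state_changing{$script};
}
</verbatim>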
See Wikipedia on cross-site scripting and cross-site request forgery.