I wonder if Docker can replace Puppet.

I’m curious to see how hard it would be to push out Docker versioned configuration changesets over ssh to ‘anywhere’, with some kind of idempotency via system ‘tags’.

I’ve finally spent a little time playing with Docker, and to be honest, the really simple “here’s a list of commands that get run to set up the image” approach feels awesome.

To test it out, I put the simplest steps I could think of to create a working Foswiki installation into a Dockerfile:

FROM ubuntu
MAINTAINER    Sven Dowideit <svendowideit@home.org.au>

RUN echo deb http://fosiki.com/Foswiki_debian/ stable main contrib > /etc/apt/sources.list.d/fosiki.list
RUN echo deb http://archive.ubuntu.com/ubuntu precise main restricted universe multiverse >> /etc/apt/sources.list
RUN gpg --keyserver the.earth.li --recv-keys 379393E0AAEE96F6
RUN apt-key add //.gnupg/pubring.gpg
RUN apt-get update
RUN apt-get install -y foswiki

#create the tmp dir
RUN mkdir /var/lib/foswiki/working/tmp
RUN chmod 777 /var/lib/foswiki/working/tmp
#TODO: randomise the admin pwd..
RUN htpasswd -cb /var/lib/foswiki/data/.htpasswd admin admin
RUN mv /etc/foswiki/LocalSite.cfg /etc/foswiki/LocalSite.cfg.orig
RUN grep --invert-match {Password} /etc/foswiki/LocalSite.cfg.orig > /etc/foswiki/LocalSite.cfg
RUN chown www-data:www-data /etc/foswiki/LocalSite.cfg

RUN bash -c 'echo "/usr/sbin/apachectl start" >> /.bashrc'
RUN bash -c 'echo "echo foswiki configure admin user password is 'admin'" >> /.bashrc'

EXPOSE 80

and then I can create the image with a simple:

docker build -t svendowideit/ubuntu-foswiki .

and run that image by calling:

docker run -t -i -p 8888:80 svendowideit/ubuntu-foswiki /bin/bash

Which (assuming that port 8888 is unused on my host computer) means I can do some testing by pointing my web client to http://localhost:8888/foswiki

The bash shell lets me debug what is happening; when I exit it, everything is shut down and all changes are lost. If I make changes, I can commit them, but at this point I prefer to make a new Dockerfile.
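
For reference, a rough sketch of that commit workflow (the container id is whatever docker ps reports, and the :tweaked tag is just an illustration):

docker ps -a                         # find the id of the exited container
docker commit <container_id> svendowideit/ubuntu-foswiki:tweaked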

The interesting thing is that Docker seems to create an intermediate image for every command, so if I add some RUN lines, or make changes, it doesn’t need to re-do steps that it has done before… which sounds to me just like Rex, Puppet, Ansible etc., but more reusable.

And so, I’m curious to see how hard it would be to push out Docker versioned configuration changesets over ssh to ‘anywhere’, with some kind of idempotency via system ‘tags’.

 

PS: the Docker image is available from https://index.docker.io/u/svendowideit/ubuntu-foswiki/, and uses my Debian packages, so you should install new extensions using apt-get install.
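
For example (the extension package name below is illustrative; apt-cache search foswiki lists the real ones), inside the running container:

apt-get update
apt-get install -y foswiki-calendarplugin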

inkChilli wiki to Sharepoint migration tool

If you need to convert from TWiki or Foswiki to Sharepoint (2010 and 2013 now, 2007 in a few weeks’ time when I’ve tested it), please head over to http://inkChilli.com, or email me.

Two months ago I was asked for help migrating a company’s TWiki site to Sharepoint, and then again a few days later by someone whose company had declared that they needed to shift their foswiki information to Sharepoint too.

In response, I spent those two months researching much of Microsoft’s undocumented APIs, and implementing a tool to do the job.

If you need to convert from TWiki or Foswiki to Sharepoint (2007, 2010, 2012 and Office 365), please head over to http://inkChilli.com, or email me.

 

 

HTML5 & RDFa-lite semantic templating engines

If you’re writing a webapp using RDFa-lite, it makes sense to consider using RDFa markers in the DOM-based template engine.

While working on round-trip HTML5 <-> RDFa-lite query and editing, I’ve been struck by the primitive feeling of HTML template languages. {{mustache}}, <%escapes%> and <--%ssi escapes %--> all work the same way as C preprocessor macros, string-concatenation style, which ignores the underlying structure that we’re working with.

Thankfully, HTML5 is changing everything: it has (or will have) a template tag.

Along the way to finding that, I came across Weld.js (dead?), Transparency, Plates and Pure, not forgetting Knockout.js.

None of them quite goes where I want, as they either use class/id, or their own data- attributes.

What I’m after is leveraging the semantic annotations already present on existing (or template tag) elements to add new ones.

For example, I might have the following link menu, and then want to dynamically add more entries from a remote query:

     <ul id="page-list">
         <template id="page-item-template" style="display:none;">
            <li  typeof="WebPage" resource="/">
                <a property="url" href="/" tabindex="-1" id="index">
                    <span property="name">Home</span></a>
             </li>
          </template>
             <li typeof="WebPage" resource="/">
                  <a property="url" href="/" tabindex="-1" id="index">
                    <span property="name">Home</span></a>
             </li>
             <li  typeof="WebPage" resource="/SvenDowideit.html">
                  <a property="url" href="/SvenDowideit.html" tabindex="-1" id="SvenDowideit">
                    <span property="name">Sven Dowideit</span></a>
             </li>
        </ul>

The following works for browsers that don’t support the new HTML5 template tag:

var node = document.querySelector('#page-list [typeof=WebPage]').cloneNode(true);
node.querySelector('[property=name]').textContent = 'TODO';
node.querySelector('[property=url]').href = '/TODO.html';
document.querySelector('#page-list').appendChild(node);

A nice start, but to me there’s still something not right with the addressing scheme, which is where some of the above template engines use the @ symbol to denote that the value should be set on the named attribute.

Something like this might work:

var node = document.querySelector('#page-list [typeof=WebPage]').cloneNode(true);
node.render( {
    '[property=name]@textContent':  'TODO',
    '[property=url]@href': '/TODO.html'
});
document.querySelector('#page-list').appendChild(node);

But setting up the relationships beforehand, and making the clone implicit:

var template = getTemplate('#page-list [typeof=WebPage]', {
   name: '[property=name]@textContent',
   url:  '[property=url]@href'
});
document.appendChild(template([
 {name: 'Sven Dowideit', url: '/SvenDowideit.html'},
 {name: 'TODO', url: '/TODO.html'}
]));

Which looks an awful lot like Weld.js’ API (and the inverse of Pure’s?). And it should be trivial to implement using Transparency.

In the process of thinking it through, I wrote a simplistic version that deserves replacing when I’m thinking about something else:

        getTemplate: function(templateSelector, map) {
            if (this.Template === undefined) {
                var template = document.querySelector(templateSelector);
                var template_map = {};

                // split each 'selector@attribute' mapping into its two parts
                for (var key in map) {
                    var address = map[key].match(/^([^@]*)@?(.*)?$/);
                    template_map[key] = {
                        node: address[1] || '*',
                        attr: address[2] || 'textContent'
                    };
                }
                this.Template = function(values) {
                    var new_elements = [];
                    for (var idx in values) {
                        // clone the template element for each set of values
                        var elem = template.cloneNode(true);
                        for (var key in values[idx]) {
                            if (template_map[key] !== undefined) {
                                var node = elem.querySelector(template_map[key].node);
                                //TODO: detect if key is a method, and call it?
                                if (template_map[key].attr === 'textContent') {
                                    node.textContent = values[idx][key];
                                } else {
                                    node.setAttribute(template_map[key].attr, values[idx][key]);
                                }
                            }
                        }
                        //TODO: if it were a real template tag, apparently there would be an elem.content
                        new_elements.push(elem.children[0]);
                    }
                    return new_elements;
                };
            }
            return this.Template;
        }}

TodoMVC written using backbone-forms

I’ve been working on some Bootstrap instant-edit user interface ideas, and while integrating hallo.js, I was reading about backbone.js and the VIE create.js RDF form generators.

This led me to backbone-forms, which auto-creates the backbone view from a model schema…

So to try it out, I wrote a TodoMVC example using backbone-forms, moving almost all of the code into the view.
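
For illustration, here’s a minimal sketch of the kind of schema backbone-forms works from; the model and field names are mine, not the actual TodoMVC example code:

// hypothetical sketch: backbone-forms generates the edit view from this schema
var Todo = Backbone.Model.extend({
    schema: {
        title:     { type: 'Text', validators: ['required'] },
        completed: { type: 'Checkbox' }
    }
});

var form = new Backbone.Form({ model: new Todo() }).render();
document.body.appendChild(form.el);   // the generated view, with no hand-written field markup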

MongoDB had a server-side query JOIN

When I developed the Foswiki MongoDB integration, I worked out a really ‘nifty’ way to do cross database and collection JOINs.

But it’s finally been broken in MongoDB 2.2, with the removal of the global lock.

So we’re going to have to restrict the foswiki MongoDB plugin to version 2.0.

 

So, you’re curious?

One reason I decided to use MongoDB as a target for foswiki ad hoc queries was the fallback of writing $where clauses in JavaScript. I’ve used them when the Perl-isms in foswiki’s query results could not be magically matched, string versus number duality for example.

And one facility that MongoDB’s JavaScript has is the ability to call db.getSisterDB('someotherdbname');

So, for a (very simplified) example:

db.current.find({$where : "db.getSisterDB(this.otherDB).current.findOne({topic: this.otherTopic}).value == 'what are you looking at'"});

Yes, this is not a SELECT JOIN, just a WHERE JOIN, but it’s exactly what we needed.
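
To make the shape of that query concrete, here’s a hypothetical pair of documents (the database, collection and field names are illustrative) that the $where clause joins:

// in the mongo shell, illustrative data only
use otherdb
db.current.insert({ topic: 'SomeTopic', value: 'what are you looking at' })

use maindb
db.current.insert({ otherDB: 'otherdb', otherTopic: 'SomeTopic' })

// the $where javascript runs for each document in maindb.current, and follows
// otherDB/otherTopic into the sister database to evaluate the condition
db.current.find({ $where: "db.getSisterDB(this.otherDB).current.findOne({topic: this.otherTopic}).value == 'what are you looking at'" })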

 

 

Foswiki 1.1.5 released – rpms, debs and usbstick ready

George has been leading the charge to a major bug-fixing release of foswiki: we’ve resolved over 120 issues, and worked hard to improve security, dealing with some interesting cross-site scripting issues found by ‘SonyStyles’, and then pushing on to harden the registration process to deal with spammers.

Foswiki’s password system can now migrate your users’ password store to more modern hashing methods; the default that we inherited from TWiki can thus move from crypt to md5-apache.
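
For example, a sketch of the relevant LocalSite.cfg settings, assuming the stock HtPasswdUser manager; the exact keys and values are best confirmed in configure’s Security section:

# illustrative settings only - confirm the key names in configure
$Foswiki::cfg{Htpasswd}{Encoding}   = 'apache-md5';  # move away from 'crypt'
$Foswiki::cfg{Htpasswd}{AutoDetect} = 1;             # recognise entries still stored in the old encoding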

Four days after the release, the installation and maintenance options for 1.1.5 have improved too:

  1. my yum package repository (extensions too)
  2. my debian package repository (extensions too)
  3. my Foswiki on a USB stick for Windows
  4. Oliver’s VirtualMachine

More Apache conf magic, this time for foswiki

Last month I needed to diagnose two issues with a foswiki installation.

The first was the constant issue of pinpointing performance problems; the second was session persistence that wasn’t persisting.

Both of these needed some form of logging to track when and to whom they were happening, so I figured the easiest thing to do was to use Apache to log what I needed.

Performance monitoring

Apache can log ‘The time taken to serve the request, in microseconds.’, and it can log HTTP response header values. So I added a little code to the foswiki installation to output a HiRes timer of how long it took to render the request, and set up my log as:

#add a 'performance' log
LogFormat  "%h %l %{SCRIPT_URI}e%q %u %t %>s %Ts (%DuS) foswiki: %{X-Foswiki-Monitor-Rendertime}o " performance
CustomLog logs/performance_log performance

Using this log, we can compare configuration changes and load against both Perl execution times and (it seems) some measure of communication time.

Session Cookie logging

In this foswiki’s case, there was a mix of http/https, ipv4/ipv6, Client SSL Certificates and hotfixed RewriteRules that I was suspicious of. So given that it worked for my connections more often than not, I wondered if there were conflicts of session cookies between ssl and non-ssl, or something more insidious.

So I started logging session cookies (GUIDs):

#add a 'session cookies and strikeone' log
LogFormat  "%h %{HTTP_HOST}e %>s \"%r\" %{pid}P \"%{SSL_CLIENT_S_DN_CN}e\" %{FOSWIKISID}C %{SFOSWIKISID}C %{FOSWIKISTRIKEONE}C " session
CustomLog logs/session_log session

In both cases, these log files let me pinpoint what the problem was not, and then have the inspiration that fixed the worst of it.

 

X-Foswiki-Monitor-renderTime patch

I’ll either add this to foswiki 1.2.0, or make a plugin for it, but if you want to see how long things take to render, apply this patch:

NOTE: you will need to install the Time::HiRes CPAN library

diff --git a/core/lib/Foswiki.pm b/core/lib/Foswiki.pm
index 4771f71..d26bd80 100644
--- a/core/lib/Foswiki.pm
+++ b/core/lib/Foswiki.pm
@@ -838,6 +838,9 @@ BOGUS
         }
     }

+    $this->{response}->pushHeader( 'X-Foswiki-Monitor-renderTime',
+        $this->{request}->getTime() );
+        
     $this->generateHTTPHeaders( $pageType, $contentType, $text, $cachedPage );

     # SMELL: null operation. the http headers are written out
diff --git a/core/lib/Foswiki/Request.pm b/core/lib/Foswiki/Request.pm
index 2ce2e15..a06af69 100644
--- a/core/lib/Foswiki/Request.pm
+++ b/core/lib/Foswiki/Request.pm
@@ -36,6 +36,14 @@ use Assert;
 use Error    ();
 use IO::File ();
 use CGI::Util qw(rearrange);
+use Time::HiRes ();
+
+sub getTime {
+    my $this = shift;
+    my $endTime = [Time::HiRes::gettimeofday];
+    my $timeDiff = Time::HiRes::tv_interval( $this->{start_time}, $endTime );
+    return $timeDiff;
+}

 =begin TML

@@ -69,6 +77,7 @@ sub new {
         remote_user    => undef,
         secure         => 0,
         server_port    => undef,
+        start_time     => [Time::HiRes::gettimeofday],
         uploads        => {},
         uri            => '',
     };

 

 

 


Zero overhead Client-Side error logging with Apache

Thanks to a tweet by a Brisbane local (Bruce), I was continuing to mull over the disconnect I have from external web traffic tracking tools. I prefer to reduce the number of requests needed to serve my users, and zero requests are always fastest.

I’ve been messing with Apache CustomLog formats to debug session and performance issues in foswiki, and so given a hammer, wondered why not apply it to more things.

Then comes Bruce’s link to Client-Side Error Logging With Google Analytics, a continuation of You Really Should Log Client-Side Errors – and I wondered…

What if I put the client error into the next user request made to the server?

The JavaScript:

// store the error details in a cookie, so the next request to the server carries them
function logError(details) {
    $.cookie('clientError', details);
}
window.onerror = function(message, file, line) {
  logError(file + ':' + line + '\n\n' + message);
};
$(document).ajaxError(function(e, xhr, settings) {
  logError(settings.url + ':' + xhr.status + '\n\n' + xhr.responseText);
});
// clear the cookie on page load, so each error is only reported once
$.cookie('clientError', null);

The Apache CustomLog settings:

#add a Client Error log
 LogFormat  "%h %l \"%r\" %u %t %>s %{clientError}C" clientError
 CustomLog ${APACHE_LOG_DIR}/clientError_local_log clientError

and the result:

192.168.1.51 - "GET /~sven/core/pub/System/JQueryPlugin/plugins/foswiki/jquery.foswiki.js?version=2.01 HTTP/1.1" - [07/Apr/2012:16:18:26 +1000] 304 -
 192.168.1.51 - "GET /~sven/core/pub/System/TwistyPlugin/jquery.twisty.js?version=1.6.0 HTTP/1.1" - [07/Apr/2012:16:18:26 +1000] 304 -
 192.168.1.51 - "GET /~sven/core/bin/view/Sandbox/TestClientSideLogging HTTP/1.1" - [07/Apr/2012:16:18:31 +1000] 200 http%3A%2F%2F192.168.1.51%2F~sven%2Fcore%2Fbin%2Fview%2FSandbox%2FTestClientSideLogging%3A1%0A%0Acall_me%20is%20not%20defined
 192.168.1.51 - "GET /~sven/core/pub/System/TwistyPlugin/twisty.css?version=1.6.0 HTTP/1.1" - [07/Apr/2012:16:18:32 +1000] 304 -
 192.168.1.51 - "GET /~sven/core/pub/System/JQueryPlugin/plugins/livequery/jquery.livequery.js?version=1.1.1 HTTP/1.1" - [07/Apr/2012:16:18:32 +1000] 304 -

This way we get super fast client error tracking, with no extra traffic.

 

(If someone has the apache-foo to get the right SetEnvIf or RewriteCond to only log when an error is defined, please help; I tried, but failed.)
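
An untested sketch along those lines, using SetEnvIf to flag requests that carry the cookie and an env= conditional CustomLog (the directives exist, but this exact condition is unverified):

#only log requests that actually carry a clientError cookie - untested
SetEnvIf Cookie "clientError=" has_client_error
CustomLog ${APACHE_LOG_DIR}/clientError_only_log clientError env=has_client_error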

down the https rabbit hole: foswiki irc support is awesome.

Last month I spent time trying to work out why a client’s foswiki setup was slow.

http requests to the view script were taking on the order of 1 second to respond (not good, but bearable given the virtual machine setup), whereas https requests were taking around 5 times as long.

The connect times for https were an eye opener: sometimes they took up to 500ms.

After poking at all sorts of options, spending time profiling foswiki, and generally assuming we had something wrong, I got to discussing the issue with Paul on the #foswiki irc channel, and he asked some interesting questions, the most notable being:

Are the host names in the /etc/hosts file?

And the answer: nope (that’ll teach me for not reviewing the basic server setup 🙂).

So I added them, and suddenly the performance of the https connections became a lot more in line with the http times.
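
The fix itself is just making sure the machine’s own names resolve locally; something like these /etc/hosts entries (the hostnames and address are placeholders):

127.0.0.1      localhost
192.168.0.10   wiki.example.com wiki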

So if you have a problem that is vaguely related to your foswiki / twiki, come brainstorm with us on irc, in the foswiki support web, or on the mailing list. You just never know when someone will ask the right odd question 🙂

Centos yum install foswiki and Debian apt-get install foswiki

That’s right, on Redhat Enterprise and Centos, it’s now just as easy to install foswiki and its ~300 plugins as it is on Debian.

This means that you can now manage your Enterprise Foswiki using the same package management tools as the rest of the operating system.

For example, I just installed a demo system with:

yum install foswiki-jhotdrawplugin foswiki-ldapcontrib foswiki-newuserplugin foswiki-glueplugin foswiki-ldapngplugin foswiki-calendarplugin foswiki-edittableplugin foswiki-interwikiplugin foswiki-renderlistplugin foswiki-smiliesplugin foswiki-tableplugin foswiki-directedgraphplugin

and when yum finished, I browsed to http://server/foswiki/ and it was up and running.

These packages are built by a script that downloads the latest extensions from http://foswiki.org/Extensions, generates an EPM manifest and then builds rpm packages, every night. I have not yet tested them with Redhat Enterprise 6 or Fedora.

 

To try it out, you’ll need to add the EPEL repository, and then this one to your yum config:

 

su
rpm -Uvh http://download.fedoraproject.org/pub/epel/6/i386/epel-release-6-5.noarch.rpm
cd /etc/yum.repos.d/
wget http://fosiki.com/Foswiki_rpms/foswiki.repo

and then run

yum makecache

 

To see what foswiki extensions are available, run

yum search foswiki

To install foswiki, and some plugins:

yum install foswiki foswiki-workflowplugin foswiki-jscalendarcontrib foswiki-ldapcontrib

Then browse to http://servername/foswiki/bin/configure to enable the plugins and configure settings.