A few more thoughts. The economics of software development is interesting to me mostly
because it is so unlike the manufacture of physical products (the focus of Babbage's
interest), in that it has fixed costs but virtually nil variable costs. This leads to
many effects, desirable and not so desirable, as can easily enough be imagined.
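A toy calculation makes the point concrete (the numbers here are invented purely for illustration): with a large fixed development cost and a near-zero marginal cost, the average cost per copy of software collapses as volume grows, in a way a physical product's never can.

```python
# Illustrative only: invented numbers comparing software's cost curve
# with a physical product's. The fixed cost is paid once; the variable
# cost is paid per unit shipped.

def average_cost(fixed, variable, units):
    """Average cost per unit at a given volume."""
    return (fixed + variable * units) / units

software = lambda n: average_cost(fixed=1_000_000, variable=0.01, units=n)
widget = lambda n: average_cost(fixed=1_000_000, variable=25.00, units=n)

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10} units: software ~{software(n):10.2f}, widget ~{widget(n):10.2f}")

# Software's average cost tends toward its (near-zero) marginal cost,
# while the widget's can never fall below its 25.00 per-unit cost.
```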
The nexus between design and genuine software innovation is interesting. One example
is the comparative merits of SVG (a valuable 'information engineering' innovation,
I believe) and Flash (advanced via sales of 'designer' software tools). The purchase
of Macromedia (Flash) by Adobe (early advocates of W3C SVG, as a declarative PostScript?)
would seem to have marked the death of SVG, but then innovations to improve browser
rendering speed, and perhaps market positioning, have just possibly seen a revival
of SVG after a hiatus of ten years.
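The 'declarative PostScript' characterisation is easiest to see in the markup itself. A minimal sketch (the shapes and colours are arbitrary examples): the document states what should be drawn rather than the drawing steps, and Python's standard XML parser is used here only to show that it is plain, machine-readable data.

```python
# A minimal SVG document: declarative drawing instructions as plain XML.
# The shapes and colours are arbitrary examples.
import xml.etree.ElementTree as ET

svg = """\
<svg xmlns="http://www.w3.org/2000/svg" width="120" height="120">
  <rect x="10" y="10" width="100" height="100" fill="cornflowerblue"/>
  <circle cx="60" cy="60" r="35" fill="gold"/>
</svg>
"""

# Because SVG is just XML, it can be parsed, queried and transformed
# like any other data -- the 'information engineering' quality.
root = ET.fromstring(svg)
for shape in root:
    print(shape.tag.split("}")[-1], shape.attrib)
```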
I feel the Cloud computing phenomenon will be less interesting for its cost benefits
(given security and vulnerability issues) than for its disruption of the status
quo, in that it takes the low-variable-cost factor to its logical conclusion, which
is to charge not for an up-front purchase but for suitability to a specific purpose
and the ability to evolve, integrate and scale dynamically (a service-oriented approach).
NK, stand up.
Regarding yet another interesting newsletter from the world of ROC:
The notion of an 'information engineer' is an interesting one that I've given a bit
of thought to recently. I have a slightly different point of view: the trend is
more towards system 'designers' as opposed to 'engineers'. My view is that the main
trend has been from computing as a mainly mathematical discipline in its earliest
stages (Donald Knuth being the exemplar, I guess) more and more towards
empowering the common person to use the computing power of CPUs without
really having a clue how it all works at the hardware, or even the code/algorithm, level.
I see design as a more intuitive, connect-the-bits, systems kind of thinking
rather than an understanding of how to squeeze maximum performance from limited resources.
It's interesting that the idea of patterns, originally expressed by an architect
(in A Pattern Language), should have been adopted by programmers as a means of communicating about high-level design.
One thing that interests me about NK is that it seems to fit with this concept of
a 'system designer', in that it provides a visual interface with which to connect
up the bits and then, very importantly, to observe the resulting system in action,
potentially without having to know a lot of the finer detail. The concept of the
'leverage' of tools is a useful one, I think, in terms of using the right tool for
the job (though it may be confusing given the 'fulcrum' analogy already in use in NK terminology).
In looking at the use of UML I came to the conclusion that a true systems modelling language
should enable you to test alternative modelled scenarios by "running the model". Executable
UML seems to be an attempt at this, but when you look at what is available from this
line of thinking you get nowhere fast (IMHO, just some very technical books that only
an engineer, just maybe, could digest).
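By way of contrast, even a few lines of ordinary code can give the "running the model" feel that Executable UML aims at. A sketch, with invented states and events: a tiny document workflow defined as data, then exercised against alternative scenarios by execution rather than inspection.

```python
# Illustrative sketch of 'running the model': a tiny state machine,
# defined declaratively as data, then tested against alternative
# scenarios. The states and events are invented for the example.

TRANSITIONS = {
    ("draft", "submit"): "review",
    ("review", "approve"): "published",
    ("review", "reject"): "draft",
}

def run_model(start, events):
    """Replay a scenario against the model, failing on illegal moves."""
    state = start
    for event in events:
        key = (state, event)
        if key not in TRANSITIONS:
            raise ValueError(f"no transition for {event!r} in state {state!r}")
        state = TRANSITIONS[key]
    return state

# Two alternative scenarios, compared by running them:
print(run_model("draft", ["submit", "approve"]))           # happy path
print(run_model("draft", ["submit", "reject", "submit"]))  # rework loop
```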
To me NK seems closer to a useful design-by-modelling approach (or 'trial and error
modelling', which is just to acknowledge the limitations of the human mind). I think
the Domain-Driven Design approach is possibly the last word on the usefulness of conceptual
modelling: defining a 'ubiquitous language' (a semantic model?) is a useful starting point.
I came across a quote recently from a Balisage paper by Kurt Cagle on REST, to
paraphrase: some people see the web as a giant application (framework?), but others
(the REST camp) see it as a giant database (of abstract resources). For me this comparison
was insightful. My "big issue of the times" is how to enable system designers to make
efficient use of this web-scale 'database', which is obviously what the W3C Semantic Web is
all about and, on a parallel path I think, what ROC is about. But at the
same time it seems that meeting this challenge is, to a degree, a return to the
way of the artisan (whose abstract design 'patterns' Alexander considered so
useful) rather than to the efficiency of mass production.
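The 'giant database of abstract resources' idea can be sketched in a few lines (the URI and formats here are invented for illustration): one logical resource, identified by a URI-like key, resolved on request into different concrete representations. This is roughly the resource/representation split that REST and, I understand, ROC share.

```python
# Illustrative sketch: one abstract resource, many representations.
# The URI and media types are invented; a real system would resolve
# over HTTP using content negotiation.
import json

RESOURCES = {
    "res:customer/42": {"name": "Ada", "city": "London"},
}

def represent(uri, media_type):
    """Resolve an abstract resource to a concrete representation."""
    data = RESOURCES[uri]
    if media_type == "application/json":
        return json.dumps(data)
    if media_type == "text/plain":
        return ", ".join(f"{k}={v}" for k, v in data.items())
    raise ValueError(f"no representation as {media_type!r}")

# The same abstract resource, rendered two different ways:
print(represent("res:customer/42", "application/json"))
print(represent("res:customer/42", "text/plain"))
```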
But unfortunately, in my experience, talk of 'semantics' and 'abstract resources'
just confuses otherwise smart people who presently think in terms of two-dimensional
spreadsheets and relational tables. So the challenge that interests me most is how
to provide 'tools' that allow people (other than engineers) to move across to this
alternative, significantly more powerful way of working with information. I think the
start is in view, in things like the Taverna Workflow environment and nCode.
This is a non-academic subject in the sense of whom people go to for help building
their systems. We are currently dominated by large companies of (supposedly) expert
systems 'engineers' peddling expensive one-size-fits-all solutions, when the really
significant potential, to me, lies in smaller 'shops' of specialist 'artisans', mainly
with a custom design focus, where experience actually has monetary value for
those who are experienced, rather than mainly for the owners of the business's capital.
(Did someone mention SharePoint?)
The main design-related factor in the success of the web is that the cost
of adding to it is nearly always lower than the value of doing so. That is, this basic
economic truth encouraged people to add to it, via apparently simple tools
(HTML/HTTP) having great 'leverage'. In taking it to the next level the tools need
to be similarly 'powerful'; NK seems to me to be part of that picture.