Update: Stop Government Snooping

At the moment there is a huge amount of confusion about this proposal. Newspaper reports are neither comprehensive nor consistent, and I can't help but think that the confusion extends to politicians.
What we do know:
1. The IT industry is very big, and it has just had the NHS IT project pulled. Combined with restricted spending, this translates into less money for the bigger contractors. The momentum behind trying to get government to commit to huge (and uncosted) IT spends is enormous.
So, in short, appeals to security needs remain particularly unconvincing amidst all this dust.
2. The industry works hand in glove with GCHQ, the Police and so forth. They know very well how to put together a 'good story'. And they cannot be very informative about the real issues, even if they wanted to be, because of the Official Secrets Act.
3. It is possible to read up on and deduce certain things, the Official Secrets Act notwithstanding. I happen to know, anyway, that GCHQ is already interested in precisely the technologies that would be needed to implement this sort of legislation.
4. We are told that new legislation is needed so that intelligence agencies can do X, Y and Z. But we do not know that they do not do X, Y or Z already. In fact we do not know whether the law is ill defined with respect to some activity that already occurs. For instance, packet sniffing might, under current legislation, be equivalent to wire tapping (I don't know this), but technically it is not the same (I do know this). Until someone comes forward with a full analysis of both the legal and the technical aspects, we are in the dark. Even sticking to the technical side, given the demands of secrecy and the ability of those in the industry to obfuscate, we still have a considerable uphill struggle.
5. We are told something about real-time mirrors, about no new database, and about Skype, text and IM. What we are not told is how encryption will be circumvented (Blackberry, anonymous web sites such as Pider), how IM and internet telephony will be intercepted (e.g. Facebook IM, Skype), and so on. Just considering Facebook, Twitter and other social networking sites, how on earth would this work? People can have multiple ids and connect one to one, over secure connections, with others who have (multiple) aliases. How can intelligence agencies know that alias X is their target and that (assuming they can see this) alias Y is a contact of X?
5.1. In answer to the first part of the question I cannot see how that is possible apart from network sniffing and triangulation, or, possibly, spying on that person to learn their alias in conjunction with the triangulation.
5.2. The answer to the second part of the question would be by having the destination address(es) of the message from the message header made available. What I can't see is how intelligence services would know the value of the message without examining it. Indeed, how could they learn to whom the message is actually sent without further triangulation (network packet sniffing)?
5.3. I find it very difficult to imagine that, having taken the destination address off the queue, the message body would not also be inspected and analysed for key terms etc.
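To make 5.2 concrete, here is a minimal sketch of the distinction between header metadata and message body. The message and the aliases are entirely made up for illustration; the point is only that a destination address sits in the headers and can be read without touching the body:

```shell
# A hypothetical intercepted message; aliases invented for illustration.
msg='From: alias-x@example.org
To: alias-y@example.org
Subject: meeting

The body is opaque without inspection.'

# Header metadata is everything before the first blank line. The
# destination address is exposed there without reading the body:
printf '%s\n' "$msg" | sed '/^$/q' | grep '^To:'
```

The 'To:' line falls out trivially, while the significance of the body does not, and linking alias-y to a real person requires information the header alone cannot supply, which is the nub of the question above.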
6. I had always assumed that intelligence services did, in fact, use packet sniffing technology. I can't understand what else it is that they want since they could do this silently, without drawing attention to themselves.
The idea that there could be legal safeguards seems farcical to me, given that there is no existing limit placed on the activity of our, or any other country's, intelligence community in this area. All it requires is a few strategically placed routers. Perhaps it is the increased commercialisation of the internet that drives this? At present, would improved infrastructure paid for by users carry traffic that is less available to sniffing?

Should add ...



Captain's log - KDE 3.5 - 4.6 MIGRATION

Well, I have installed Ubuntu - no, actually Kubuntu 11.04.

But, for some reason, I remain in doubt whether I am actually using KDE 4.6 or 3.x.

How did I achieve this? Well, mainly with trial and error, so not much help to others.

I can say that the starting point was 9.04 Jaunty Jackalope, so some big leaps.

The main thing is that there are two steps: the Ubuntu (base system) upgrade and the Kubuntu (KDE) upgrade.

Research each problem as it comes up and there is a solution.

Apart from that - well, am I in KDE 4? What has happened to Nepomuk, which still fails because it cannot find the Sesame backend? And I cannot find how to install Sesame, apart from cloning it with git and then compiling it.

This I will try next.

I used packagekit, but sometimes it would fail.

Then I would have to run this:-

sudo dpkg --configure -a

Here are some other commands -

sudo aptitude install build-essential xorg-dev cdbs debhelper cmake kdesdk-scripts subversion ssh xserver-xephyr doxygen dbus-x11 libxml2-dev libxslt1-dev shared-mime-info libical-dev libgif-dev libssl-dev libboost-dev libboost-program-options-dev libboost-graph-dev libgpgme11-dev libxine-dev libqimageblitz-dev libbz2-dev libdbus-1-dev libpam0g-dev libpcre3-dev libkrb5-dev libsm-dev libclucene0ldbl libclucene-dev libjpeg62-dev libxtst-dev xsltproc libxrender-dev libfontconfig1-dev automoc librdf0-dev libdbusmenu-qt-dev docbook-xsl docbook-xml libattica-dev libqtwebkit-dev shared-desktop-ontologies libphonon-dev libqt4-dev dbus-x11 libstreamanalyzer-dev libstrigiqtdbusclient-dev libxml2-dev libxslt1-dev librdf0-dev libjasper-dev libopenexr-dev libacl1-dev libsasl2-dev

sudo aptitude install libasound2-dev libaspell-dev libavahi-common-dev libenchant-dev libjasper-dev libopenexr-dev libxml2-utils

sudo aptitude install libsmbclient-dev libxkbfile-dev libxcb1-dev

sudo aptitude install libxklavier-dev libxdamage-dev libxcomposite-dev libbluetooth-dev libusb-dev libcaptury-dev network-manager-dev libsmbclient-dev libsensors-dev libpam0g-dev libnm-util-dev

sudo aptitude install libpoppler-qt4-dev libspectre-dev liblcms1-dev libexiv2-dev

sudo aptitude install libusb-dev libcfitsio3-dev libnova-dev libeigen2-dev libopenbabel-dev libfacile-ocaml-dev libboost-python-dev

sudo apt-get install libqca2-dev

sudo apt-get install libsvn-dev libboost-serialization-dev kdevplatform-dev kdebase-workspace-dev

sudo kate /etc/apt/sources.list
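For reference, these are the sort of lines one ends up editing into sources.list. This is a hypothetical example assuming the kubuntu-ppa/backports PPA on Natty (11.04); check the PPA page for the real lines and keys before using anything like it:

```
# Hypothetical sources.list entries (assumed: kubuntu-ppa/backports, natty).
deb http://ppa.launchpad.net/kubuntu-ppa/backports/ubuntu natty main
deb-src http://ppa.launchpad.net/kubuntu-ppa/backports/ubuntu natty main
```

After editing, a `sudo apt-get update` is needed before the new packages become visible.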


My Google Profile


On Borrowed Time

I have just finished reading my brother's report:
'On Borrowed Time
Avoiding fiscal catastrophe by transforming the state’s intergenerational responsibilities'

It is available here:-
Adam Smith Institute, author Miles Saltiel: On Borrowed Time 8/12/10 PDF document.
Link updated to own storage 15/05/2017.
It will never catch on. The fundamental objection to the argument for a return to laissez-faire is that it would expose people to the harshness of an unreconstructed market, such as we already see with the immigration of cheap labour, where people are prepared (are forced and exploited into) living several to a room.
However, I find myself very sympathetic to the thrust of his arguments and the vision he articulates.
I suppose I believe that the system we have, very faulty and incredibly wasteful of human potential, is a social compact against the worse human behaviour that laissez-faire risks.
In short, I don't think we are capable of it, despite its appeal.
Read his report and come to your own conclusions.

The Nokia 1100 is selling for over $30,000 - to hackers

Em. But foaf+ssl would work after any bank authentication procedure. I don't think the way banks abuse TANs would be relevant to this. This is interesting because it tends to show the potential security of the approach.
Needs investigating further though, and steps fleshed out for different scenarios.

From the top

I realise that I am never going to fit into my work environment.
I don't think that I can blog about this impartially. But, perhaps, usefully?
Before the work specifics, let's set the scene.
We have a government that wants to grow and promote the intellectual capital of this country.
How are we doing?
The answer is probably surprisingly well despite, not because of, the government.
This applies particularly to my field of IT.
Translating requirements into an application is one thing, but translating efficiently, creatively and offering true value for money is something entirely different.
It is quite clear that government has no criteria for the latter and, therefore, no depth of understanding of the subtle consequences of decisions that ensue, from contractual arrangements downwards.
Largely, the government, in its many IT endeavours, is ensnared by large IT companies who make it their business to justify the highest possible costs over the short and medium term.
Because those companies cannot show net profits above around 5% in the public sector, they find other ways to make money: the most common pattern is that of over-complicating requirements gathering and elongating the time it takes to fulfil an item of work. Attendant patterns are non-cooperation with IT partners and back-loading costs onto non-IT ancillary services.
All of this comes about because the Civil Service is inattentive to contractual detail. They are undermanned and underskilled in their oversight roles. They also consistently choose large suppliers rather than a series of small suppliers, which means that they delegate structural organisation in a way that discourages competitive innovation.
Let's look at the consequences of this.
The government could be promoting exemplary projects, and it would be important if they did so.
Three truths:
IT is a fluid field with much to learn, new ways of doing things on every level of the project. Experimentation coupled with uncertainty is the norm. This is so much the case that it cannot be said that further experimentation necessarily increases risk (within some parameters). Experimentation, trying things where results are uncertain, can reduce risk.
This truth is fundamental to understanding good IT governance.
The second truth is that top-down governance is intrinsically flawed. The larger the pyramid the larger the mass of detail that is essentially unknown and, hence, contains hidden risk.
The third truth is that large pyramids are intrinsically unstable and dangerous when one needs to interact with another, at whatever level up or down the pyramid.

The long, flat base

Notice how pyramids, as such, are functionally useless and are not part of a modern construction repertoire (by which I do not mean pyramid-shaped buildings, but actual pyramids). Emblematic they are, though, and rightfully so. Engineering is more about bridges, road and rail lines of communication, and various sorts of buildings, depending on function.
Pyramids, though, are emblematic of human structures, and not without reason. The repetitious work is greatest at the base, the hardest higher up. Slavery is equal at any point. How many people had to be ground to dust to prevent the remaining one or two hidden in their midst from also being ground to dust? Large companies are similar, unless they are very skilful in their people management.
This is the problem:-
The broader the base the more pointless and tedious each component task, because each task at the base withstands a huge amount of pressure from above.
This is not intelligent design.
The pyramid, itself, is not an intelligent structure. But even if it were a bridge there are trade-offs between one massive bridge and several small ones. For instance one massive bridge would never have worked as a solution for bridging the Thames in London.
But this is what government does with IT, completely unnecessarily given IT's flexible and scalable nature.
These solutions have three disastrous consequences.
They cannot be efficient solutions.
They cannot be optimum solutions because they squeeze creativity out of those at the bottom who must implement them.
They cannot be economically competitive because they distort the market and deprive smaller companies of opportunity. What competition there is is at the expense of the first two points and this is ultimately disastrous for any policy that is meant to promote intellectual capital in this country.

Project National Health - policy driven IT

How can an IT project grow from 2.3 billion to over 12, with no-one questioning the manner in which the project is set up? I find it very odd that any country can engage in such a huge project in such a wasteful way. We know that government wants to mitigate risk by risk transfer, and that can only work if the company is sufficiently solid to bear the putative risk. At first sight this seems obvious and good business. However, look at the figures and imagine the cost of this risk mitigation. No sensible project would be underwritten to that degree. The reality is that large contractors visit the risk back on the government, just as the banks have. And the reason for this is poor, unimaginative management.
The NHS IT project has been an absolute disaster for the taxpayer and for the IT industry in this country: for the former in terms of value for money, and for the latter in terms of boosting innovation and competitiveness. What I find very difficult to understand is how a government can expound the virtues of intellectual capital and competitiveness on the one hand and act in such a crass and destructive manner on the other.
The reality, again, is that large contractors visit the risk back on the government, just as the banks have. The dynamic is not quite the same: for the banks it was an absolute that they should not fail, whereas this project, as a flagship of government policy that the government does not want to be seen to fail, cannot be allowed to fail. Except that, by any sensible measure, it already has failed.
Feeding the insatiable appetite of large IT companies has decimated the innovation that can only thrive in the competitive environment fostered by healthy small firms.
To understand all of this, fundamental principles have to be revisited.
Definitions of intellectual labour need to be understood, especially in the context of IT engineering, something that government simply does not get at all.
I shall lay some trails.
One of the strong motives behind Open Source software is subversive; the other is the wish not to have to repeat behind closed doors what need be done only once if the doors are open.

SemanticC: Open Contract http://ping.fm/xnWjM