
Knowledge architecture for socio-economic policy analysis

Creating a knowledge architecture for socioeconomic policy generation


MSc Dissertation Paper

Paul Suciu
29,740 words


Contents

Introduction
Heuristics and iteration
Policy literature review
Introduction to knowledge architectures
The project work logs
Conclusions
Bibliography
Annex 1 A short visual introduction to Liquid Feedback
Annex 2 Traditional socio-economic policy analysis schools
Annex 3 Short introduction to Semiotics
Annex 4 Educational externality
Annex 5 Political implications of the knowledge architecture


This paper investigates the application of open source knowledge architectures to the volatile field of socio-economic policy generation, starting with exploratory research grounded in a heuristic and iterative methodology and in the philosophy of deconstruction. It then moves on to a more pragmatic implementation through the Liquid Feedback crowd-network policy-enabling platform, showing both the capabilities of the software and possible pathways for improving its consensual decision-making method and the quality of the policy generated. Finally, in the case study of the EU labor market, it attempts to enhance the policy generation process by adding end-user-friendly, populated visualization methods, such as the Timeglider widget API.

Paul Suciu


Introduction
Background
The main reason why this will be a communications-oriented perspective on economics, and not an accounting one, is that as a one-year master's graduate in Accounting and Finance I felt I couldn't speak from an authoritative position on any particular economic policy matter.[1] What I can do, however, after a four-year BA in communication and various IT and project management experiences, is to envision/facilitate a process of economic policy transfer/decoding at community level.[2] Through experience I came to the conclusion that personal opinion in enabling change is rather superfluous. I used to engage in fierce, endless debates on matters of policy, which, because of my training as a communications specialist, I would often win, until I came to realize that what matters in practical community issues isn't a private opinion, but the best possible opinion adopted by the largest majority, and that the process by which one arrives there has a major impact on its finality. I was trained as an economist, but discussing complex economic frameworks is a highly problematic issue, as the level of inconsistency present within the field has led to calls from academics to deem it a pseudo-science (Taleb, 2007 and Zhu, 2009); the name "Economics" is misleading, as a variety of properly supported codes and languages are gathered under this misnomer (Zhu, 2009). So why did I bother paying 5k/year for an Accounting degree, when my final paper is so removed from the field? It has to do with my perception of the fundamental enforcement of business realities within an institution. Money and its flow allow for survival within Darwinian capitalism, and it is this survival that offers legitimacy to an enterprise. As an outsider I lived under the illusion that the field, by its numeric orientation, is forced to adopt a much more rigorous framework for systemizing reality than your run-of-the-mill social science.[3]

[1] As, unfortunately, a skills-based program such as my Accounting degree hardly lends itself to academic banter and critical thinking beyond its immediate numeric concerns.
[2] Pessimists would say I am deluding myself, but I have in fact witnessed a number of times (mostly studied the cases) how policy was in fact changed by individual action, with substantial results. A recent example would be the defeat of ACTA at EU Parliament level, after the mobilization of the IT community. This particular instance didn't take years, but merely weeks to accomplish. In my view, as I kept a close watch on the issue, the debate came down to one issue: how the online community managed to organize itself in a superior manner against a back-door policy-laundering lobby sphere, through better IT literacy.
[3] Where in the world can one find a measure of stability if not in the most fundamental/stable part of the economy? I was, of course, unaware of the many compromises that currently exist within the IFRS adoption of principles, sources of friction or debate.


I do, however, remain positive and, agreeing with Gabbay and Hunter (1993) that meta-language/rules have the role of reducing inconsistencies in an improperly formulated language, and having noticed an attempt to methodically implement such a language within the IFRS framework, I decided it was worth a closer inspection. While codifying the IASB framework, the IFRS aims to create its own meta-language, which can bring some consistency to a field marked by fundamental changes in recent years.[4] It is a slow and arduous process, taking years between agenda consultations and post-implementation reviews.[5] All the while, the IFRS has also attempted to codify accounting principles for machine code, so it has in fact created a parallel, this time rather proper, meta-language in the form of the XBRL taxonomy for IFRSs and the IFRS for SMEs, to facilitate the electronic use, exchange and comparability of financial data prepared in accordance with IFRSs.[6] This, for me, was an example that a concerted effort could be made towards codifying an entire epistemological field, that the symbolization of a limited reality was possible, and that, given enough resources, one could arguably systemize the extremely volatile field of policy making.[7] The main difference between the IFRS approach and my interest lies in the backing: while IFRS is supported by private initiative, the kind of policy I envision involves the larger public, organized in a community of thought. It is my strong belief that eventually the community approach to socio-economic policy discussion will be the only established one, acting as a foundation on which the private sector will thereafter be able to build a unified and coherent framework of business that I can unapologetically adhere to.

[4] It is also an extremely flawed process, highly contested by its many contributors, even at the level of its most basic assumptions, such as the asset definition.
[5] http://www.ifrs.org/Current+Projects/IASB+Projects/IASB+Work+Plan.htm, example and current developments.
[6] That is because machines don't understand the nuances and possible contradictions of human communication and need a clear code to parse.
[7] And the above is not a singular model of development, with similar areas of policy formulation being undertaken at all levels of the EU and beyond, from a variety of perspectives and interests.


Heuristics and iteration


Research question
Is it possible to build a knowledge architecture for generating higher quality/quantity socio-economic public policy, moving from the current least-common-denominator/populist approach (by eliminating various types of bias) to a better interaction with, and utilization of, the mass user? Can we also make sure, through the use of open source software, that the emerging user community has the tools to re-actualize itself continuously, in such a manner that it will improve upon the policy generation process? What are the current developments in the field, and what can be done to improve upon them? Ideally, can we build an observational model/proof-of-concept for the theory identified?

Community of intent/community of knowledge/community of production


By bringing community support into policy generation we are attempting to raise the quality of the political discourse and create a better product, both because of the higher number of interested individuals involved and because of the easier adoption of policy, as the impacted group would be the same one that generated it. However, since we aim to avoid ending up once more with the lowest common denominator, from a crowd unable to articulate a consensual, efficient/final position,[8] we must also describe a mechanism of participatory learning and genuine executive capabilities within the community - which raises the major issue that before anything else we must envision/create a community.[9] There should be an economy of production in relation to policy, just like with any other commodity. In the same manner that the management accountant has at his disposal a formidable ERP system,[10] which he can feed data and receive results and updates from within minutes, so should the political individual be able to draw up the best plan, based on easily understandable, processable and transferable data (therefore valuable to him).
[8] Both the delegitimized traditional power structures and the grassroots activism movements that seek to replace them suffer from the same weakness: a difficulty in articulating purpose and the lack of clear operational and managerial frameworks (Tushnet, 1995).
[9] "Social formation in the last instance... is not the spirit of an essence or a human nature, not man, not even men, but a relation, the relation of production" (Althusser, 1971, observation on Marx's social materialism). It is necessity that moves people to act together, and not idealism. This is supported by Charles Peirce (1931): "it is hard for man to understand this, because he persists in identifying himself with his will." This limitation of the self in respect to production output is a truth that most individuals come to realize on their own, and it enables them to seek the support of others in tackling difficult issues. And concerted effort does mean co-opting as many members of society as possible and giving them the opportunity to satisfy their own needs, though in this case mostly the need for self-expression and legitimacy at/through community level.
[10] Enterprise resource planning.


Providing the regular citizen with easily manipulable information to support his decision-making process in matters that concern him is an imperative dictated by the rule-of-many[11] democratic principle. Why the need for higher quality/volume information? The problem is that current policy debate and political discourse are often reduced, because of the exploitation of cognitive bias, to the lowest common denominator (the populist approach), expressed through only a few points of view, mainly the binary of left and right, relegated to a four-year cycle and highly unsatisfactory in a consumer society where individuals get to vote on preferences almost every day.[12] Without a sense of control over the political process, and therefore personal self-actualization, we end up with most of the voter core feeling delegitimized and unmotivated.

Heuristics[13] and iteration as a research method?


There are a myriad of issues to be tackled in a practical policy design implementation, as opposed to merely defining and analyzing an academic issue in a standardized format. Theory and practice aren't perfectly aligned even when one knows exactly what the outcome should be, never mind when operating on a fluid concept that changes as new data becomes significant through exploration. While the project design and scope might change, the one thing that cannot be changed is the limited capacity of one individual. In a world of incertitude and change it is now recognized that the human mind employs a variety of shortcuts, which were only fully recognized when the same principles had to be employed by IT in designing programming languages.[14] These methods of learning are what we call heuristics and iteration.[15] There were many instances where I operated in the dark within the project, especially within the programming environment, based only on the conviction that I would succeed in overcoming any obstacle, if merely by following strategic/topical cues and guiding myself not by the principle of the best solution, but by the convenient solution. This is what Wikipedia[16] defines as heuristics: "experience-based techniques for problem solving, learning, and discovery. Where an exhaustive search is impractical, heuristic methods are used to speed up the process of finding a satisfactory solution."
[11] Rule by the many is "polity" (which gave us "policy") in its ideal form and "democracy" in its perverted form, according to Aristotle's theory of democracy (Encyclopedia Britannica Online).
[12] The ubiquitous Like button.
[13] Cognate of "eureka" or "heureka", meaning "to find", referring to a trial-and-error method of investigation used when an algorithmic/structured approach is impractical. http://wordinfo.info/unit/%20781?letter=E&spage=6
[14] In computer programming a code is parsed or interpreted by a program employed by a PC or network of PCs according to an algorithm and set outcomes. It is possible to operate an algorithm in the presence of uncertainty by applying heuristic methods, where uncertainty is either rounded up to the closest convenient/practical value or completely ignored.
[15] While these terms are not known to most socio-economic scientists, they are extremely familiar to programmers, as creating a program that will not crash on meeting an unknown/new value is a constant design challenge.
[16] http://en.wikipedia.org/wiki/Heuristics


Examples of this method "include using a rule of thumb, an educated guess, an intuitive judgment, or common sense." While the Wiki quote might seem mundane,[17] it does provide a link to the pragmatic interpretation of heuristics as a method in IT, where according to Pearl (1983) heuristics are "strategies using readily accessible, though loosely applicable, information to control problem solving in human beings and machines." We see that the programmer makes little distinction between individuals and machines in this context, as language parsing is an inherent function of both the human thought process and machine computation. Iteration, on the other hand, is a lot easier to understand, as it means "the act of repeating a process usually with the aim of approaching a desired goal or target or result. Each repetition of the process is also called an iteration, and the results of one iteration are used as the starting point for the next iteration" (Wikipedia page). In respect to this paper, not only am I following the iteration/heuristic model, but I aim to make it a part of my design, to transfer it to the crowd as a method of learning and generating policy output (and as I mentioned earlier, the project conceptual/building process and its ultimate functionality are intimately and inexorably linked). You can understand that attempting to frame a real-life process theoretically is a massive hurdle. For one thing, there's a limitation on the theoretical specificity one can bring to the issue, otherwise the ramifications would make it impossible to conclude the project. A degree of specificity is a MUST, otherwise there would be no discernable original ideas in a wide field of knowledge. I wish I could say that this research is about analyzing the primary data generated through external user interactivity with the exploratory tool; however, since the building project is part of a much bigger picture, extending at least a couple of years into the future, that is not the case. Instead, the focus of this particular Master's paper will be on analyzing the qualitative primary data generated by managing the building/integration of the various policy generating/enforcing features within the IT platform, and the ideas and threads generated by such a specific endeavor. Beyond the numerous technical details, there needs to be a real focus on the need to deliver a "true and fair"[18] policy representation perspective. In this I choose to believe that I can borrow heavily from the structured approach promoted by accountancy, through its IFRS Framework. The systematic approach to data collection here tends to be quite heterogeneous and will undoubtedly generate much more qualitative than quantitative data[19] (Saunders et al, 2007). At least in part, I would define this as a managerial control process, where the data collected will be trans-disciplinary in nature (IT, communication and economics).
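To make the contrast between exhaustive and heuristic search concrete, here is a minimal sketch in Python. The "policy options", their costs and scores are invented purely for illustration; it is not drawn from the project's own data, only from the general definition quoted above.

```python
# A toy illustration only: the "options" and their scores are invented for this sketch.
from itertools import combinations

options = {"A": (4, 7), "B": (3, 5), "C": (2, 4), "D": (5, 8)}  # name: (cost, score)
budget = 7

def exhaustive(options, budget):
    """Try every subset: guaranteed optimal, but the work grows as 2^n."""
    best, best_score = (), -1
    names = list(options)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            cost = sum(options[n][0] for n in combo)
            score = sum(options[n][1] for n in combo)
            if cost <= budget and score > best_score:
                best, best_score = combo, score
    return best, best_score

def greedy(options, budget):
    """Heuristic: take the best score-per-cost items first - fast, merely 'satisfactory'."""
    chosen, score, spent = [], 0, 0
    for name, (cost, value) in sorted(options.items(),
                                      key=lambda kv: kv[1][1] / kv[1][0], reverse=True):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
            score += value
    return tuple(chosen), score

print(exhaustive(options, budget))  # (('A', 'B'), 12) - the optimal answer, exponential work
print(greedy(options, budget))      # (('C', 'A'), 11) - the convenient answer, almost no work
```

The greedy pass returns a slightly worse but perfectly usable answer in a fraction of the time, which is precisely the trade-off the heuristic method accepts.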

[17] As opposed to academic propriety; despite being one of the better definitions out there, it is offered through community debate and support, a design which this paper wholeheartedly promotes.
[18] From http://www.frc.org.uk/about/trueandfair.cfm; while these guiding principles are difficult to attribute to a particular author, despite being at the center of accounting practices in the UK for a very long period of time, their application is closely monitored by the Financial Reporting Council.
[19] And similarly framed conclusions.


Policy literature review


Policy taxonomy[20] and structure

Structure is paramount for this subject, as the various policy components have to be represented within the project: easily identifiable at the analytical level, debatable at the decision-making level and easy to communicate during the agenda setting/implementation process. By structure, policies generally possess:
- a purpose statement (almost like an abstract)
- applicability and scope (allows them to be organized and monitored)
- an effective date of coming into force (except for retroactive policies)
- responsibilities of key players
- a policy statement
- background (allows us to understand the policy)
- a glossary of definitions (dictionary)

In its simplest form, the policy analysis model follows these basic steps:
1. Agenda setting (problem identification)
2. Policy formulation
3. Adoption
4. Implementation
5. Evaluation

Althaus et al (2007) propose an 8-stage policy analysis cycle (figure 1), based on heuristics and iteration, easier to manage than the traditional model presented before, which is based on the assumption of previous expertise in policy matters.[21] A policy cycle is "a first foray into complexity, organizing observations into familiar patterns and so providing a guide to action." Unlike the traditional, hegemonic model, theirs considers a broader range of actors involved in the policy space, including civil society organizations, the media, intellectuals, think tanks or policy research institutes, etc. Going beyond the scope of policy generation, the rational planning model (RPM, for systemic/pragmatic process organization) intends to enable the user to attain the best possible solution by following a systematic approach in his heuristic endeavor.
[20] Althaus et al (2007), my inspiration for the heuristic and iterative policy generation model.
[21] "Professional staff in large government departments... They too were often required to realize significant public policy goals armed only with their disciplinary training and some bureaucratic experience. Even basic civics sometimes proved unfamiliar to those trained as engineers or lawyers. They needed a bridge from technical expertise to the policy domain." (Althaus et al, 2007)


It is not only easily applicable to policy generation, but also serves to illustrate how the process could be seen from an input/output perspective, in a relational grid (figure 2) extremely familiar to IT programmers (Levinson, quoted by King, 2005).

Figure 1: The Policy Cycle, which Althaus et al (2007) describe as a heuristic model.

Figure 2: The Rational Planning Model (Levinson, 2005).


Policy as a sign/signal/code[22]
Chandler (1995) says that the conventions of codes represent a social dimension in semiotics: "a code is a set of practices familiar to users of the medium operating within a broad cultural framework... Society itself depends on the existence of such signifying systems." He then continues to say that codes aren't just simple conventions, but "procedural systems of related conventions" which operate in certain domains and which transcend single texts, linking them together in an interpretative framework. He then goes on to quote Stephen Heath (1981), in that a code is distinguished by "its coherence, its homogeneity, its systematicity, in the face of the heterogeneity of the message, articulated across several codes." Codes help simplify phenomena in order to make it easier to communicate experiences (Gombrich 1982, 35). Signs and codes are generated by human institutions and in turn serve to maintain them, either through self-propagating myths or careful, gentle symbolic insemination.[23] According to Chandler (1995), the most important issue that concerns modern semiotics is that we are not merely the slaves of authority-generated ideology, but active assigners of meaning, in the smallest details of our lives. And in that we transcend that old imperative described by Sartre in his theory of Being, by not merely being the observer or the observed, but the painters of our whole universe.

Enabling a community epistemological[24] network[25]

John C. Lilly's early, simplistic definition (1968) of the human bio-computer had a lot more going for it than initially thought. By envisioning the mind as a very sophisticated machine, Lilly has allowed us to take the next logical step and envision society as a network experience.[26] Because of sheer size, the products of this network tend to be vastly superior to those created by corporate users, provided there's a network-wide demand. Policy does in fact meet this criterion, and the only thing that remains is supplementing the capabilities of the human network with an IT architecture that would simplify decision making, allow for easy visualization and give the sensation of control to the individual user and self-actualization to the community - along with many other things that were inaccessible before the advent of social platforms.

[22] Why codes? As a child in an insecure world of change I actually attempted to create my very own, highly hermetic code of communication. To put it simply, I was a graphic design child prodigy. Even now, that particular code (which was by no means restricted to the visual and which I have jealously guarded and continued to develop) underlies my every action through its ideological influence, and I feel compelled to justify my particular queer existence through expressing social utility.
[23] See the Inception movie.
[24] Knowledge.
[25] In this day and age, when a ridiculous number of educated people can't find an application for their abilities, I intend to offer all those underemployed linguistics/foreign relations/arts individuals a chance to participate in an organized, socially beneficial activity.
[26] With the common sense limitations of such mechanicism.


The problem is that individuals are not only limited in their individual/communal capability to process code, but are also subjected to various types of bias, ironically because of their own heuristic methods, used to mitigate uncertainty and promote the self. An exhaustive list of such biases has been provided by a mixture of cognitive science, behavioral psychology and other fields, and it is too wide to discuss here.[27] In disrupting complex structures, bias ultimately tends to be polarizing, which is why we end up with a left and a right for a political spectrum, a choice between 0 and 1, which in itself represents the simplest programming structure possible in a chaotic network. This yes/no design needs to be upgraded with non-polarizing ones, such as the Wh+ group of Who? Where? Why? What? When?[28] There are multiple cultural connotations on a seemingly common denotation. Even the most natural, culturally well-adjusted term is culture-specific, bringing up personal associations (ideological, emotional etc.) of the term. "These associations are related to the interpreter's class, age, gender, ethnicity and so on" (Wilden 1987, 224). Not only do I want the community I envision to be able to generate its own code interpretation, I want it capable of understanding/analyzing overarching and competing codes. At its most basic level, the site should serve as a very sophisticated tool of policy code breaking/reconstruction, using the best resource available on the market, the human brain (the CAPTCHA case[29]).
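As an illustration of the structural difference between the two designs, here is a minimal sketch in Python. The field names and example values are invented for this sketch; they are not taken from Liquid Feedback or any existing platform.

```python
# Illustrative only: invented field names, not the schema of any real platform.
from dataclasses import dataclass, field
from typing import List

@dataclass
class BinaryBallot:
    """The polarizing design: one given question, a 0/1 answer."""
    question: str
    approve: bool

@dataclass
class WhProposal:
    """A non-polarizing record: the Wh+ questions force factual research before voting."""
    who: str                # actors responsible / affected
    what: str               # the measure itself
    when: str               # time frame
    where: str              # jurisdiction or scope
    why: str                # justification, linked evidence
    sources: List[str] = field(default_factory=list)

vote = BinaryBallot("Adopt the labor market reform?", approve=True)
proposal = WhProposal(
    who="EU-level social partners",
    what="portable training entitlements",
    when="two-year pilot",
    where="EU labor market",
    why="reduce structural unemployment among young workers",
    sources=["http://example.org/placeholder-study"],
)
```

The point of the sketch is only that the second structure carries its own context, so a community can debate and amend each field instead of collapsing the discussion into a single yes/no.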

Deconstruction method/philosophy
As soon as I was able to articulate the title of my initial site iteration,[30] I began conscientiously employing the neo-structuralist method of deconstruction in my approach to policy analysis.

[27] I will attempt to address the issue of cognitive bias mitigation at a later date in the project, after a better observational understanding of the matter, as I am currently concerned only with structurally inherent bias. Suffice to say that such bias can be mitigated by allowing individuals the IT tools to build communally agreed structures of thought, whether based on individual or communal observation.
[28] I must point out that the rather imperative yes/no refers to a given choice (often sensitive to compliance), characteristic of hegemonic policy generation, while the Wh+ method forces the research of factuals and gives a much broader range of choices, as is characteristic of a diffuse network environment. The Wh+ method is also the preferred one for information gathering in the social sciences.
[29] A CAPTCHA (Completely Automated Public Turing test to tell Computers and Humans Apart) is a small image used for security purposes to ensure that the end user is human and that no software bot is using the web server resources. The problem is that its strength, human recognition, is also its weakness, and human code-breaking networks have been established by varied means to take them down. http://www.boingboing.net/2004/01/27/solving-and-creating.html - "a curious future, where commodity pornography, in great quantities, is used to incent human actors to generate and solve Turing tests like captchas"; http://web.archive.org/web/20071106170737/http://ap.google.com/article/ALeqM5jnNrQKxFzt7mPu3DZcP7_UWr8UfwD8SKE6Q80 - Paul Ferguson, network architect at Trend Micro, speculated that spammers might be using the results to write a program to automatically bypass CAPTCHA systems. "I have to hand it to them," Ferguson said, laughing. "The social engineering aspect here is pretty clever." Maybe I should follow this model to entice users, just as Jimmy Wales from Wikipedia did.
[30] www.deconstructingeurope.co.cc, with the important word being "deconstruction".


It is my hope that the lowest common denominator will eventually shift from left-right swings to a deconstructive process. Deconstruction,[31] by its scope, tends to constitute itself as a challenge to established structures, and policy symbols and codes of communication are no exception. But being a part of semiotics, it doesn't represent a method per se, but rather a philosophy of approach on which proper and specific processes must be built for operational efficiency.

Textual medium
The complexities of community/consensual decision making[32] at policy level can only be manifested in today's society through the most conventional medium, the textual one;[33] it is simplistic in its component elements (the letters) and yet, because of that very simplicity, capable of supporting greatly nuanced textual structures of symbolic meaning. The textual metafunction[34] acts "to form texts, complexes of signs which cohere both internally and within the context in and for which they were produced" (Kress et al, 1996). That coherence in itself doesn't strike anyone as particularly impressive, but it has a harmonizing effect on policy formation as part of a larger corpus,[35] with the effect of eliminating dissent as soon as it appears. That might mean that changing policy could very well entail a concerted effort to engage/change an entire corpus of legislation.[36] Textual determinism following adaptive expectations/path dependency[37] is inherent to the way information is processed through such an impersonal textual medium. Text is highly susceptible to external influence, especially when its repository is under the control of a singular entity.[38]
[31] One of my favorite notions about deconstruction is that it doesn't challenge anything directly; it attacks the paradigm and then watches as the construct, or the needed bits of it, fall on their own (the purpose here is to reconstruct the idea in a more code-friendly manner).
[32] Complex decision making, as I like to call it.
[33] From analogy/emotion to digital method. Written convention eliminates the unpredictability of emotions and emphasizes consensus by eliminating the dissent of face-to-face interpretations. The process of learning further makes choosing the best available alternative a willing/discovery process that binds support through constructive thinking.
[34] Metalanguage/metacodes function in the background at the same time as our main observable code, but are not usually perceived by the untrained user. Gabbay and Hunter (1993) argue that metalanguage has the role of reducing inconsistencies in an improperly formulated language. As most codes can only be understood in reference to other superclass/subclass codes, it is essential for policy not to be analyzed merely on its own, at face value. Whenever metalanguage is used simply to plug holes in policy design, we are offered an opportunity/lever to deconstruct/reconstruct said policy up to the point where a better paradigm becomes available.
[35] Or through various component definitions.
[36] For example, one of the proposed pathways for legal change at higher EU levels refers to the challenging of national constitutions, which is as fundamental as you can get. In a sense the rejected EU constitution did just that, attempting to challenge established policy corpuses.
[37] Page (2006).
[38] History is written by the winners, etc.


Code parsing/compiling
Parsing, a term used both in linguistics and computer science, is the splitting of a formal language/code into its smallest units, or tokens, which can be used thereafter for syntactic analysis. In the case of policy, these tokens will be constituted more or less by terms eventually included within the dictionary taxonomy. Unfortunately, while policy is based on a more formalized language than common speech, it is still in many ways tributary to natural language, unlike computer code, which operates in a context-free environment. That is why a number of different policy analytical approaches might be desirable from a code perspective:
- Lexical analysis, splitting the language into morphemes (the smallest units possible; in the case of policy these are constituted of concepts, not tokens, of iconic representations, not symbolic ones).
- Syntactic analysis, where we notice the connectors and the sentence statements, and where we can employ paradigmatic and syntagmatic analysis[39] on the code syntax and its other quantitative parameters.
- Semantic analysis, where we integrate the complex data remaining and create a full perspective drawing from available taxonomies.
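To make the three levels concrete, here is a minimal sketch in Python. The policy clause and the tiny taxonomy are invented for illustration only; real policy parsing would naturally need far richer linguistic tooling than this.

```python
# Illustrative sketch only: a toy lexical -> syntactic -> semantic pass over one policy clause.
import re

TAXONOMY = {"member states": "actor", "shall": "obligation", "report": "action",
            "unemployment": "topic", "annually": "timeframe"}  # invented mini-dictionary

def lexical(text):
    """Split the clause into lower-case tokens (the 'smallest units' level of the sketch)."""
    return re.findall(r"[a-z]+", text.lower())

def syntactic(tokens):
    """Group multi-word taxonomy terms into single units - a crude stand-in for syntax analysis."""
    joined = " ".join(tokens)
    for term in sorted(TAXONOMY, key=len, reverse=True):
        joined = joined.replace(term, term.replace(" ", "_"))
    return joined.split()

def semantic(units):
    """Map each recognized unit to its taxonomy role; unknown words stay unclassified."""
    return {u: TAXONOMY.get(u.replace("_", " "), "unclassified") for u in units}

clause = "Member States shall report unemployment figures annually."
print(semantic(syntactic(lexical(clause))))
# {'member_states': 'actor', 'shall': 'obligation', 'report': 'action',
#  'unemployment': 'topic', 'figures': 'unclassified', 'annually': 'timeframe'}
```

Even this toy pass shows where a community-managed dictionary (the taxonomy discussed next) would slot in: every unclassified token is a candidate entry for debate.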

Taxonomy

A word that kept repeating itself within my research, until I came to the realization that I was working within a field that desperately needed such a tool to unify what the specialist bias of various fields has hermetically isolated through various interpretations of the same topic, in protecting particular spheres of influence. Therefore I had to create the simplest project taxonomy, to be used as a frame of reference and relationships, enabling me to clearly analyze the code building process, its functionality and finality. Eventually, I intend for this aspect to become a community-managed dictionary.

[39] "Or" and "and", the simplest Boolean logical operators.


Introduction to knowledge architectures


Although policy development and enforcement itself is a political or cultural process, not a technological one, "technical systems architecture can be used to determine what policy opportunities exist by controlling the terms under which information is exchanged, or applications behave, across systems" (Taipale, 2004). We have seen the difficulties arising from policy discussion as a subject/code. It is clearly impossible to run such a complex code only on a human network;[40] therefore in this chapter I intend to showcase the theory for a supporting IT architecture.[41]

Fig. 3 Policy management architecture, client-server reference model (Taipale, 2004)

In its simpler server-client form, matching Shannon and Weaver's original communication model (1949), Taipale (2004) presents his architecture as in Figure 3 above.
[40] Knowledge architectures cannot be upgraded any longer exclusively on pure human social networks.
[41] While I'm using Taipale's work (2004-2010) to legitimize and further evidence my own, my product isn't based on his models, as I came upon it rather late in my literature review process. What Taipale (2004) calls a Policy Management Architecture, coming from the control/moderate/enforce policy position, I call a Knowledge Architecture, a more generic term that emphasizes the need for community education and self-determination.


The difference is that his architecture is distributed across the Internet and doesn't attempt to facilitate user feedback and convenience on one branded site structure, as is the case with my plan. The end user is less of a policy generator and more of a receiver, according to the hegemonic model.[42] Of course, one could argue that in his network-stack model (figure 4) Taipale addresses the issue of user cooperation and leaves out the anonymous originator, but counterarguments can be made that:
a. The application layer, with its forums, collaborative work tools, directories, etc., is too distributive (spread across the Internet into factions and groups) to be able to offer a consistent alternative to the hegemonic policy generator.[43]
b. The audit tools remain once more the prerogative of a limited group, anonymous in its intentions and presence.

c. The legacy data repositories described are extremely difficult to access for the average civic user.[44]
d. No policy code description is given, despite the author being fully aware of the importance of syntax/semantics/pragmatics in this style of communication.

[42] This simple model allows us to view the main points of contention: 1. It has a distorted feedback loop, with an anonymous source as the originator of policy and in charge of semantic control on the server side, with the end data user required to subscribe. 2. There's an implicit gatekeeper/monitoring element in the oversight logs, which means control of the architecture is not in the hands of the end user. 3. It doesn't address the community user.
[43] Not only that, but it is unlikely that such a distributed layer will be readily accessible to the civic user, who will once more find himself as merely a receiver of policy, created at a plutocratic level. The only way the end user can be motivated enough to use this system is to be legitimized through the power of the many, the social user, which isn't addressed.
[44] As they are in non-standard formatting and difficult to visualize, the current design is only accessible to the most educated of policy readers, who from their expert position become in effect the leading policy-generating plutocracy.


Fig. 4 Policy management architecture, network-stack reference model (Taipale, 2004)

Introducing the notion of middleware[45]/appliance


Basically, middleware is the administrative component of the architecture. A layman's split by functionality and design, in the context of a policy architecture, would give us (with the most popular, and invariably open source, implementations):
1) Scripting languages allow for the automation of the myriad scripts that run in the background, allowing users to focus on website interactivity.
a. PHP, the most popular server-side scripting language, designed for the generation of dynamic web pages (which allow for user interaction, control and modification of, let's say, an SQL database). Liquid Feedback is written partially in PHP, for standardization purposes (the backend). The frontend, however, is written in WebMCP, a rather unknown competitor of PHP, which allows for enhanced client-side features over the PHP limitations.

[45] "In enterprise architecture for systems design, policy appliances are technical control and logging mechanisms to enforce or reconcile policy (systems use) rules and to ensure accountability in information systems... Any form of middleware that manages policy rules can mediate between data owners or producers, data aggregators, and data users, and among heterogeneous institutional systems or networks, to enforce, reconcile, and monitor agreed information management policies and laws across systems (or between jurisdictions) with divergent information policies or needs. Policy appliances can interact with smart data (data that carries with it contextual relevant terms for its own use), intelligent agents (queries that are self-credentialed, authenticating, or contextually adaptive), or context-aware applications to control information flows, protect security and confidentiality, and maintain privacy. Policy appliances support policy-based information management processes by enabling rules-based processing, selective disclosure, and accountability and oversight." (Taipale, 2004)


b. Javascript, a client-side object scripting language, primarily employed because it enables enhanced interfaces in dynamic pages. The visualization widget Timeglider, employed here for creating timelines, is a library of Javascripts.
2) Relational databases. SQL in this case is a database query language, with the PostgreSQL object-relational database management system required by the Liquid Feedback software for user logs. Postgres is employed by a multitude of scalable architectures, such as Yahoo!, Skype, Reddit and Instagram.
3) Markup languages are simply syntactic conventions of text annotation that can be easily read by humans, but also employed by programming languages. A true open source repository will have all its knowledge codified in this manner, so as to allow for easy, automatic transfer/search/selection of textual code between various platforms, through API support of such codes (see the short serialization sketch after this list).
a. XML, on which HTML and RSS are built, has become the default format for most office-productivity tools. The standard is so popular that in some jurisdictions it has become the public repository format by default.[46] Standard formats such as this are also the ultimate battleground for corporate control.[47]
b. Json, derived from Javascript, is utilized for data serialization, as a more effective alternative to XML, between a server and a web application. Timeglider uses both XML and Json, but in different manners: the first for direct HTML representation of exceptions and the second for a true archiving of data.
4) Server client. Lighttpd is an open-source web server optimized for speed-critical environments, potentially allowing for very large numbers of users to connect at the same time (10K per second max[48]). Popular user sites such as Youtube or The Pirate Bay use Lighttpd.
5) Operating Systems, or OSs, need no real introduction, being a ubiquitous part of daily life.
a. Server side: GNU/Linux Debian is the preferred format for server architecture, because of its low resource utilization. While working mostly through a terminal console, it can be customized with a Graphical User Interface, or GUI. It runs on the Linux kernel, being completely free software.
b. Client side: Linux, Windows, Mac OS, etc. Because the technologies described above are cross-platform, users are free to engage the knowledge architecture from a variety of devices and environments.

[46] It is part of the open formats supported in the UK: http://www.cabinetoffice.gov.uk/sites/default/files/resources/open-source.pdf
[47] http://arstechnica.com/uncategorized/2008/10/norwegian-standards-body-implodes-over-ooxmlcontroversy/ - where Microsoft pushes its own format of XML, the OOXML, on terms that Richard Stallman, father of open source, strongly protests: http://www.gnu.org/philosophy/no-word-attachments.html
[48] Enabling a structural weakness called DDoS deficiency, which means that on more than 10K connections per second the server will become inaccessible. Distributed denial of service attacks are a common feature nowadays.


6) Multi-paradigm programming languages, such as C and C++, which are used for programming most of the other languages and programs, and also serve as a basis for hardware design. They provide an indirect link to the mathematical syntax that enables very complex IT processes.

By their position within the general system, Taipale also identifies:
A. Data enabling tools. Some other examples of middleware include "analytic filters, contextual search, semantic programs, labeling and wrapper tools, content personalization".
B. Data restriction tools[49] for selective disclosure, such as Digital Rights Management (DRM), anonymization, subscription and publishing tools.
C. Safety features, such as "technologies for accountability and oversight[50] [which] include authentication, authorization, immutable and non-repudiable logging, and audit tools, among others" (Taipale, 2004).
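As a small illustration of point 3 above (markup languages as the interchange layer of the architecture), here is a sketch using only the Python standard library. The record and its fields are invented for this example and do not reproduce the actual Timeglider event schema.

```python
# Illustrative only: invented fields, not the real Timeglider event format.
import json
import xml.etree.ElementTree as ET

event = {"title": "Example labor market measure", "start": "2012-01-01", "tags": ["EU", "labor market"]}

# JSON: compact, maps directly onto the data structures a web application already uses.
as_json = json.dumps(event)

# XML: more verbose, but annotation-friendly and readable by generic office/markup tools.
root = ET.Element("event")
for key, value in event.items():
    child = ET.SubElement(root, key)
    child.text = ", ".join(value) if isinstance(value, list) else value
as_xml = ET.tostring(root, encoding="unicode")

print(as_json)  # {"title": "Example labor market measure", "start": "2012-01-01", "tags": ["EU", "labor market"]}
print(as_xml)   # <event><title>Example labor market measure</title><start>2012-01-01</start><tags>EU, labor market</tags></event>
```

The same content travels in either notation; the design question is only which consumers (web applications, office tools, other repositories) need to read it.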

Open source[51] versus closed repositories


In his model, Taipale advertises the use of protected data repositories. What he forgets to mention is that security models can and do interfere with ease of access, even in merely monitoring policy generation. This paper wouldn't have been possible without open source. Not only did it rely heavily on open source infrastructure (such as the Debian OS; Liquid Feedback's PHP, Lua and C; Timeglider's Javascript, HTML, XML and JSON; etc.), but it also relied on free knowledge repositories, starting with the ubiquitous Wikipedia (for very fast topic identification) and going on to Github and stackoverflow.com (software). While outside power depends on the level of human organization, conceptual power on the net is more about standards setting. In the open source model, one does not control the outcome; one just creates the tools and tries to work with the emerging community to develop them. This has been the case for every single major project since Richard Stallman created the open source concept. It's not a perfect concept, as I recently had the chance to observe.

[49] Which I personally don't recognize as part of a Knowledge Architecture, but which are a definitive feature of a Policy Management Architecture.
[50] While I don't disagree with Taipale on the need for security, some of the language he uses makes me cringe, such as "international and national information policy and law will be reliant on technical means of enforcement and accountability through policy appliances and supra-systems authorities." Still, the same author goes on to say that "control and accountability over policy appliances between competing systems is becoming a key determinant in policy implementation and enforcement, and will continue to be subject to ongoing international and national political, corporate and bureaucratic struggle. Transparency, together with immutable and non-repudiable logs, are necessary to ensure accountability and compliance for both political, operational and civil liberties policy needs... The development, implementation, and control of these mechanisms as well as the development of the governing policies must be subject to wide-ranging public discourse, understanding, and ultimately consensus" (Taipale, 2004).
[51] Both open source information and software.


A sufficient developer community failed to form around Timeglider, and the makers pulled it out of the MIT license.[52] The web browser is an integral part of the knowledge structure, acting as a semantic selector at web level, just as the policy topic search will be for the LQ platform and the integrated search and legend functions are for Timeglider - three levels of search into our aggregated data (web, website, timeline), just for visualization.[53] However, as soon as one starts manually indexing information, one realizes an obvious limitation of the browser search function, as it cannot search structurally non-indexable data: the deep web (such as dynamic pages, locked repositories or simply poorly downloadable content, like the excruciating amount of PDFs the EU institutions post online[54]) and the no-web (data that was never meant to be shared in public, such as that available only through Freedom of Information acts[55]). It is hard, grunt work, which requires the work of many to put into proper context.[56] Because of the nature of my work, I was spared having to use protected data repositories.[57] If I can't actively link to them from my website, they are useless: dead and hard-to-upgrade pieces of information, becoming more obsolete as time passes.
[52] Refer to my communication with the Timeglider architect, Michael Richardson, where he points out that "The major users of the widget did very little to contribute to code or documentation. People are accustomed to libraries being available, and don't realize that open source is by necessity a community effort... Some developers simply took our widget and (without any real contributions to code) began building businesses around it - apps that were very similar to Timeglider" (Annex 1). Also, as you can see, unlike the academic explanation, it took me only 6 paragraphs to explain the idea to a concept/programmer/domain developer, because of his previous involvement with the public domain and the similarity of business development. How can we call ourselves economists when the basic tool of our trade, information, namely of an economic nature, evades us? I'm choosing to interpret his last statement as a confirmation of support, so I'm going ahead with using his widget. I believe that his genuine intent was to offer a product that contributed to projects such as mine, so in that sense there's a degree of satisfaction for the Timeglider author in associating their name with my concept.
[53] There is also a series of semi-automatic search functions, such as code validators, for example http://jsonformatter.curiousconcept.com/, or the various other integrated proprietary functions for editing spreadsheets and text documents.
[54] It's almost like governments intentionally make data hard to index, since anyone building a webpage knows what search engine optimization is. There just doesn't seem to be a coherent policy for end-user information. It is almost as if the end-user is relegated to second string, in the care of his higher entities, the nation-states. EU institutions are too important to deal with the end-user directly, even though their constitutional chart obliges them to do so. The data gathering through manual web crawling was incredibly boring, made even more frustrating by the intentionally inserted limited features, but we need to move away from the generic Wikipedia approaches and create specialized repositories for socio-economic policy analysis.
[55] Most of this data was by design never meant to be found. It's strange how, starting from my original concept and intent, I find fundamental interoperability problems at the level of the socio-economic information structure, which is considered to be critical both to the information of the political unit (the citizen) and to the functionality of the larger international/union network. Hiding information behind a façade of protectionism is pointless and it shows how little the bureaucratic mind really knows about information.
[56] Compare this inefficiency with a modern search engine: "Without an index, the search engine would scan every document in the corpus, which would require considerable time and computing power. For example, while an index of 10,000 documents can be queried within milliseconds, a sequential scan of every word in 10,000 large documents could take hours." Part of my idea is to have the community do the scanning work for all, so that the individual search can be solved through an easy visualization search.
[57] Open source and knowledge initiatives tend to be very easy to access, by design and intent.


On a site that should be accessible, like a governmental one, not upgrading information is a cue either to indifference[58] or to a lack of funding. But in closed repositories, the design forces data to become obsolete, so why contribute to such a process? Ideally, the information flow should follow: knowledge repositories → search → exposure → critique → debate → decision making → recording → rehydrating/enhancing/adding value to information.

Hey there Paul,

I really appreciate your email. I completely sympathize with your overall feeling regarding the open-source world, and our decision to "close" the source down to less-than "open source" licenses. I think once Timeglider is more established, we will be able to afford to open-source our core widget. For a year, I did have the widget out there under the MIT license. Basically: The major users of the widget did very little to contribute to code or documentation. People are accustomed to libraries being available, and don't realize that "open source" is by necessity a community effort. Only a couple generous developers provided feedback or actual code amounting to maybe a dozen lines of code. Some developers simply took our widget and (without any real contributions to code) began building businesses around it - apps that were very similar to Timeglider. A company offering the core of its software as an open source widget has to be in a very strong position and has to see many benefits in doing so. We definitely want to grow to that point, but meanwhile, we need to throttle the license. We still have a lot of companies developing with the widget, both commercial and non-commercial. Good luck with LiquidFeedback: it seems like a very cool project.

Cheers, Michael
co-founder, lead developer - www.timeglider.com - michael@timeglider.com - cell 208.850.8512 - twitter @timeglider

On Sat, Aug 18, 2012 at 11:53 AM, Paul Suciu <tehnikpaul@yahoo.com> wrote:

Hi, Michael,
While browsing the net like every penniless user, first I saw this https://timeglider.com/jquery/?p=intro and thought to myself what an excellent user friendly idea. Beautiful software, free of charge and excellent explanation. Then I saw this http://timeglider.com/how_it_works.php and I'm sure others have explained my disappointment. It is only fair that you guys should profit from your work, however was there no other way to do this than going fully proprietary?

Fig. 5 A copy of the 19 August 2012 e-mail conversation with the Timeglider lead concept architect, Michael Richardson, which sees me morph into an open source activist somehow.

[58] I quote from Rick Falkvinge, the creator of The Pirate Party: "Almost all the world's new creators are already working in the new paradigm; creating despite the copyright monopoly, rather than because of it... those laws can and will change... as the 250 million Europeans who share [a free information] culture come into power. 250 million people is not an adolescence problem; it is a power base of 250 million voters. As these people start writing laws, they can and will kill those monopolies at the stroke of a pen." As this declaration by Eric Kriss, Secretary of Administration and Finance in Massachusetts, proves, "It is an overriding imperative of the American democratic system that we cannot have our public documents locked up in some kind of proprietary format, perhaps unreadable in the future, or subject to a proprietary system license that restricts access," things are moving there already (2005). As a regular user, I became horrified when Youtube introduced compulsory commercials, more so when Google Maps announced it was going to charge a fee (only to industrial users, but still). Microsoft is locking Windows 8 with its software store and Facebook is flooding people with crap.

Liquid feedback

It is not often that one is terribly involved in a conceptual process taking months, only to find out that the ideas articulated within have already taken shape. While on my own I had started to realize that it was possible to articulate social change by means of highly interactive/dynamic web pages that facilitate user control and group consensus (such as through the ubiquitous PHP, the definitive public community technology), a German team had already had a viable project in the pipeline since the second half of 2010.[59] Claude Lévi-Strauss said that the process of creating something is not a matter of the calculated choice and use of whatever materials are technically best adapted to a clearly predetermined purpose, but rather it involves a "dialogue with the materials and means of execution" (Lévi-Strauss 1974, 29). What about using materials that were made by a third party, especially in the case of such a complex process as policy analysis/transfer? Well, undoubtedly the design choices[60] and the implicit purpose of Liquid Feedback have had a significant impact on the way I chose to design my own project, as the software speaks from a position of authority/benchmarking in respect to IT-enabled decision making. In a hegemonic dominance stability system, you have a top-to-bottom policy generation model, and the IT architecture will reflect that, as in Taipale's case. But with the recession hitting and the breakdown of faith in the stability of the hegemon, the dependent individuals/citizens will become independent decision makers.[61] We must remember, however, that we have had a symbiotic collaboration with the now weakened hegemon, which will move to restore the status quo (restrict the network's ability to generate policy, as we can see in a series of modern pieces of legislation at global level[62]); therefore it is essential to move from the simpler social networks (trend setters) to specialized ones that permit the expression of crowd policy at such a level of quality that it begins to alter the hegemon's paradigm.[63] One must not understand the hegemon as an enemy, but rather as a structure with an established usage, offering opportunities and challenges.
[59] The reason that I had missed it was partly because the project was in German, directed towards a German audience, and partly because the academic environment moves at a crawl compared to the speed at which new technologies appear and should be diffused. The traditional approach to research takes forever and, as we have seen from my earlier messenger lists/e-mail/e-forum investigation, it takes about two years for things to sink in and papers to be published. Even now the group barely publishes anything in English besides some introductory material.
[60] The team behind the platform has experience in database solutions: "like enterprise resource and planning, point of sale applications, reservation systems" (Nitsche, 2012).
[61] Willing to organize themselves ad hoc (we notice a rise in entrepreneurship, due to social/personal necessity, in periods of crisis, after the failure of the social contract) into the simplest and most convenient form, that of an amorphous network, which can begin to organize and generate its own policy.
[62] http://www.forbes.com/sites/larrydownes/2012/08/09/why-the-un-is-trying-to-take-over-theinternet/2/
[63] By that I mean what the LQ platform intends to do: to find social levers within established political institutions, especially in high-influence ones, such as parties, where of course they will begin with the easier-to-influence members, the lower castes.


That is why it is essential to do two things to improve individual control: enhance the quality and quantity of his decision making process[64] and enhance the reach of his decision making process.[65]

That is where PHP-enabled participatory platforms such as phpBB and Liquid Feedback come into play. Through the mechanism of shared decision making, we can build a community of intent. We must, however, distinguish between the crowd of intent[66] (the starting point) and the community of knowledge (the middle point) as two different facets of what we are attempting to steer towards our goal of policy generation[67] (production). The ultimate goal would be to build a community that can formulate not only its goals by means of this website, but also new pathways of action, such as educating its own agents of change,[68] after replacing the anonymous policy-generating user with a community think tank policy-generating user, which emphasizes participation and ultimately possesses civic legitimacy through self-representation.[69] Proxy voting (Fig. 10) with the Schulze method[70] for preferential voting (LF Annex, Fig. G) is the precise mechanism through which this representation is achieved in LF. Transitive proxy voting was first suggested in internet forums in the United States around the year 2000. Back in 2009 the growing Berlin Pirate Party wanted to preserve the chance for every party member to participate in both the development of ideas and decisions, and they thought transitive proxy voting could be a promising idea. From there the team started a democratic proposition development process and preferential voting (Nitsche, 2012).
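To make the counting step concrete, below is a minimal JavaScript sketch of the Schulze ranking described in footnote 70: given a pairwise preference matrix, it computes strongest-path strengths and returns the undefeated alternatives. The function and data are my own illustration, not code taken from the LiquidFeedback core.

// d[i][j] = number of ballots ranking alternative i above alternative j.
// Returns the indices of the alternatives that beat or tie every other
// alternative on strongest-path strength (the Schulze winners).
function schulzeWinners(d) {
  var n = d.length;
  var p = [];                       // p[i][j] = strength of the strongest path from i to j
  for (var i = 0; i < n; i++) {
    p[i] = [];
    for (var j = 0; j < n; j++) {
      p[i][j] = (i !== j && d[i][j] > d[j][i]) ? d[i][j] : 0;
    }
  }
  // Floyd-Warshall style widening: a path is only as strong as its weakest link.
  for (var k = 0; k < n; k++) {
    for (i = 0; i < n; i++) {
      if (i === k) continue;
      for (j = 0; j < n; j++) {
        if (j === i || j === k) continue;
        p[i][j] = Math.max(p[i][j], Math.min(p[i][k], p[k][j]));
      }
    }
  }
  var winners = [];
  for (i = 0; i < n; i++) {
    var beaten = false;
    for (j = 0; j < n; j++) {
      if (j !== i && p[j][i] > p[i][j]) { beaten = true; break; }
    }
    if (!beaten) winners.push(i);
  }
  return winners;
}

// Example: three alternatives in a preference cycle; the method still picks index 0.
console.log(schulzeWinners([[0, 20, 26], [25, 0, 16], [19, 29, 0]]));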

[64] This can be satisfied by either providing the individual with higher quality/volume data input (to create his own opinion) or exposing him to higher quality/volume data structures (community work), which he can adopt or enhance (through debate and research).
[65] Again this can be done in a simple manner, by enhancing the penetrating power of his decision: by creating a front of action through association, by creating the right context for the diffusion of his idea if valuable, and by allowing direct interference over agenda-setting policy activities.
[66] Freud, mass psychology.
[67] Intent without knowledge is blind and knowledge without intent is lame.
[68] Fish (1980) called this the interpretative community.
[69] Speaking from an analogous position, let me put it this way: if LiquidFeedback is the scalpel able to cut through the tissue of society, then my addendum should function as its eyes, giving it the ability to identify the best spot for an operation without leaving a gap. This community should be able to learn and self-actualize continuously and, as I mentioned before, a blind community of intent is insufficient for policy analysis.
[70] Clone-proof Schwartz Sequential Dropping (the Schulze method) allows for the expression of preferences (when the favorite doesn't win, the vote moves to another preference). This is done to ensure that the user's vote counts and that votes do not get wasted by variations of the same idea that exclude each other from the top position. Apparently the Schulze method is vulnerable to an instability that generates a less desirable outcome for players in the case of a majority of strategic voting. While I could go into Game Theory further, I will restrict myself to mentioning that by improving decision making in a community of intent, by upgrading its capabilities through a community of knowledge, we might be able to reduce that particular vulnerability.


Fig. 10 Proxy voting representation - behind the dotted line are the direct-vote proxies and individuals (GNU license image, Wikipedia)

Current users of the platform include:
a. The German Pirate Party (political party, around 7% representation in Germany) - for strengthening the power of individual members in decision making[71]
b. Friesland (formal region of Germany) - wants to adopt the platform for civic representation and is currently working with the LF team to modify it[72]
c. Slow Food Germany[73] (NGO) - for managing internal policy
d. Private instances, such as the one I aim to set up[74]

[71] There are differences between the chapters when it comes to actual usage. In Berlin LF is part of the statutes, which underlines its importance for the Berlin Pirates, while on the federal level the function is less defined. Apparently (on the federal level) LF helps the members get an idea of which propositions can gain a majority within the party. Some board members declared they decide based on LF results, and LF seems to be helpful for the preparation of party conventions, as many propositions are pre-discussed and perhaps enhanced in LF.
[72] That involves syncing the timing of certain initiatives in LiquidFeedback to the political processes (Nitsche, 2012). The platform has already spread to the Netherlands and the US and has been translated into over 10 languages.
[73] The unusual thing is that, with the platform starting as a political representation tool from multiple sources and competing platforms such as Adhocracy (not open source, emerging as LF forces change in competing entities) appearing, should the system actually catch on I envision a period of competition, congruous with Taipale's (2004) "subject to ongoing international and national political, corporate and bureaucratic struggle".
[74] Though I don't have any excessive expectations, either pragmatically or ideologically.


Liquidfeedback preexisting functionality[75]

As mentioned before, the project's conceptual/building process and its ultimate functionality are intimately and inexorably linked. LiquidFeedback was built on top of (and to support) a fast peer feedback loop,[76] which is why it is biased towards speedy efficiency.[77] My approach stems from an academic/theoretical perspective, raising the need for the improvement of public policy discourse. It matches the political side only in respect of generating a legitimizing political experience. Also, unlike the established phpBB software discussed before, Liquid Feedback is a rather recent addition to open source, and its increased versatility comes at the cost of working with insufficiently tested software. This raises the issue of security/accountability, especially for a platform already employed in civic representation (Friesland). For a full functionality disclosure, see the attached annex or the demo website at http://dev.liquidfeedback.org/lf2/index/index.html (direct link) or http://liquidfeedback.org (foundation link). The typical LF user can:
- start an initiative (proposal), which becomes an issue (to which people can add other proposals). When one proposal in the issue reaches a certain quorum, it is considered worthy of further debate and moved to the top of the discussion list. Then there is another period of debate, after which users vote on the most popular initiatives (the ones with little representation are eliminated from the voting process to save users' time)
- establish an agenda
- support existing initiatives (counting acceptance)
- suggest enhancements for existing initiatives (syntagmic debate) - analysis through debate
- start alternative initiatives (paradigmic debate)
- vote on all available alternatives at the end of the process
- transfer his vote to make it count for his own wing in a given issue's decision making
- delegate the authority to vote/debate; the delegation can be seen as a transferrable power of attorney for both the discussion process and the final voting. The delegation can be made by unit, area or issue and be revoked at any time. Regardless of existing delegations, a member can participate in a discussion and/or the voting, which disables the delegation for the given activity.

[75] Further details in the LF annex.
[76] That tends to generate Twitter-style opinions.
[77] So fast, in fact, that it risks eliminating essential issues and vulgarizing the policy analysis process. It can easily become just another arena for rapid information exchange with no majority critical thinking. If I can persuade you with my wit, I get your vote, regardless of whether the issue has found a satisfactory finality or not. It is a mirroring of party politics, and it was designed to serve the interests of a real life party. It deals in pragmatism.


A proxy cannot vote in the presence of the principal (Nitsche, 2012).
Security - Member access registration is restricted:
- by code (password invitation) that defines only one virtual instance per individual. The control password can be withdrawn from members who leave the group/party, etc.
- by civic registration, on which the developers are currently working, which would assign individual document data to the unique virtual persona to guarantee only one vote per person.
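As an illustration of the delegation rules just described (transitive proxies, revocable, and disabled whenever the principal acts directly), here is a minimal JavaScript sketch of resolving the effective voter for an issue. It reflects my reading of the mechanism reported by Nitsche (2012); the data shapes and function name are hypothetical, not LF's internal API.

// delegations: map from member id to the id of their chosen proxy (if any).
// directVoters: set of member ids that voted/participated directly on this issue,
// which disables their own outgoing delegation for that activity.
function resolveEffectiveVoter(memberId, delegations, directVoters) {
  var current = memberId;
  var seen = {};                         // guard against delegation cycles
  while (!directVoters[current] && delegations[current] && !seen[current]) {
    seen[current] = true;
    current = delegations[current];      // follow the transitive proxy chain
  }
  return current;                        // a direct voter, or the end of the chain
}

// Example: A delegates to B, B delegates to C; B voted directly, so A's vote is
// exercised by B, while C (no delegation, no direct vote) stays with herself.
var delegations = { A: "B", B: "C" };
var direct = { B: true };
console.log(resolveEffectiveVoter("A", delegations, direct)); // "B"
console.log(resolveEffectiveVoter("C", delegations, direct)); // "C"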

By reducing moderator control, the creators have put much of the responsibility for self-moderation in the hands of the community. This is done in the hope that crowd control of policy generation at site level will ensure accountability, trust and system legitimacy.[78] This has been done before, although not intentionally I suspect, but as an emerging process in insufficiently defined frameworks (such as the original Usenet[79]). The program is designed to operate even in the case of non-collaborative crowds that refuse delegation, by automatically sorting policy alternatives (and eliminating the least preferred ones) - large groups with real conflicts using strict rules in a predefined process without moderator interference. Self-moderation protocols include:
A. The bubbling system[80] of allowing the best topics to take top billing ensures that the alternatives appear in order of popularity.
B. Transparency ensures individual user accountability.
C. The proxy system ensures system moderation, as proxies' support for any initiative will likely bring it to the top, and to be a good proxy one must be an implicit moderator for a category. The system even has an expiry date for dead user support.
In conclusion, LiquidFeedback is:
- offering functionality that matches our intent (especially the proxy voting/Schulze algorithms)
- solving theoretical/pragmatic middleware questions

[78] Again, it's a feature that came about by design. In following the philosophy of getting the community to decide for itself in the real world, they went a step further and ensured self-determination at site level (in a mirroring of the real life process).
[79] User moderation protocols have been detected as game theory has been employed to study politics through the use of elaborate Usenet simulations, and the precursor of the modern forum has shown three different behaviors in conversations: unregulated environment (anarchy, 4chan style); regulated environment (protocol); self-regulated (netiquette, FB style).
[80] Similar to Reddit's system for topic ranking by community vote.


- denaturizing our intent, because of its design/target intent (a rapid decision making platform for a political party)
- not offering essential knowledge functions, forcing API and community protocol developments
- still just a base, an incomplete bare-bones design,[81] which requires a target audience

I am now going to assume that the accountability/security and decision making processes for my knowledge architecture are covered by the LF platform's middleware, and move on to describe the knowledge enhancing processes I propose for the policy generation process.

Knowledge process modification proposals for the LF platform

Through the facilitation of LF we have our community/crowd of intent. What we must now do is support this community with the necessary tools so as to also turn it into a community of knowledge. Knowledge and intent are what assure us of producing a quality final item: policy. Once again, while the LF software offers a trove of opportunity to the political individual, for the academic researcher it is a rather poor proposition, as the level of communication is no better than on any other forum and individual users might feel delegitimized by being corralled through the 10% quorum (for the issue-to-initiative upgrade of proposals) and the 50% quorum (validity of winning issues) requirements. Complexity is definitely an issue here, as most users are used to either social interaction (Facebook), trend following (Twitter, Yahoo) or exposure articles (Wikipedia) with respect to topics of interest. What I don't want to do is stifle the creativity of a few by enabling too much moderation.[82] Imagine a site that grows in complexity not just in a linear fashion, but in a network manner that aims to harness specific processes of the human mind.[83] I also wish to avoid having untrained individuals lose themselves in a bad process,[84] creating a new plutocracy of those who can adapt versus the average user. The means to achieve the desired enhanced platform functionality for the LiquidFeedback decision making software are the new Application Programming Interface,[85] which LF supports with release 2.0,[86] and the development of specific standards of presentation and community supported protocols within the platform.

[82] As forums become more and more complex and require similarly comprehensive forms of analysis, we notice either tighter control of the discussion topic or quality degeneration of the discourse.
[83] Twitter, for example, has such a low opinion of an individual's capacity for retention that it has restricted its feed messages to 140 characters. After that, the average user is considered spent without further input and the building of vertical content, which re-actualizes itself. Since the format is very popular, they were definitely right. Wikipedia, on the other hand, has no such issues, developing a horizontal model aimed at the rare encyclopedic user or, much more often, at the occasional user (likely a student). While it has a vertical process of peer review, that is restricted to the editorial side and never seen by its regular users. By inference, one should suppose that the minds behind Twitter do in fact utilize complicated horizontal forms of communication that they do not make accessible to the wide public.
[84] Jonassen (1997) stresses that "well-structured" learning environments are useful for learners of all abilities, while "ill-structured" environments are only useful to advanced learners.


The additional functions for enhancing LiquidFeedback that I propose are:[87]
A. Enhanced visualization - timeline (preventive role)
B. Enhanced visualization - exposure draft protocol (imperative role)
C. Semantic search - better search function
D. Semantic clarity - dictionary
E. Enhanced search - elapsed topics tree structure
F. Peer-to-peer communication - direct messaging window
G. Proxy suggestion box - through the direct message system
H. Community-to-peer communication - RSS feed window
I. Community creation - enabling circles
J. Generally enhancing the user profile with vote statistics, history, etc.

Visualization[88]
The need for visually representing knowledge is nothing new, and attempts have been made over the years to improve IT platforms through all sorts of technological experiments. Without realizing it, you, as the end user, are currently enjoying some of the best social/commercial designs out there. Visual IT representation[89] started in the 1980s, when SemNet produced three-dimensional graphic representations of large knowledge bases to help users grasp the complex relationships involved. The design of SemNet focuses on the graphical representation of three types of components: identification of individual elements in a large knowledge base, the relative position of an element within a network context, and explicit relationships between elements (Chen, 2002). The Internet representations of .net domains became legendary and inspired a whole range of consumer accessible relational representations, such as the Facebook module that allowed me to see my FB social circle (Fig. 11).[90] The first step in pragmatic policy analysis is to identify a problem. This is quite easy if the problem is urgent or imperative, but you wouldn't want every single issue you deal with to become an urgent matter simply for lack of being addressed.
[85] Through the API, other pieces of software can be connected (with some programming). http://dev.liquidfeedback.org/trac/lf/wiki/API
[86] Support is also provided at official developer level, by registration here: http://apitest.liquidfeedback.org:25520/
[87] None of this is truly original, but why reinvent the wheel? The most common of these functions will likely be addressed by the larger community, while I focus on the extra visualization ones.
[88] "Visual representations and interaction techniques take advantage of the human eye's broad bandwidth pathway into the mind to allow users to see, explore, and understand large amounts of information at once. Information visualization focused on the creation of approaches for conveying abstract information in intuitive ways." (Thomas and Cook, 2005)
[89] Based on a series of mathematical models for knowledge representation such as Information Retrieval Models, Bayesian Theory, Shannon's Information Theory (I'm familiar with it from my Communication BA), Condensation Clustering Values (with which I'm familiar from my previous clustering research), etc. (Chen, 2002).
[90] http://mashable.com/2009/08/21/gorgeous-facebook-visualizations/ - by no means the only option.


You would want to be prepared, and therein lies the difficulty of identifying your preparedness for a potential future issue within a massive corpus of insufficiently formulated policies. So how do you identify a problem? It seems that in the Twitter era the first guy to yell "fire" is the problem finder, and then the whole herd will run towards it with opinions while staying as far away physically as possible. But it is better to prevent than to cure, which is why the ability to monitor a situation is essential. That is why a good visualization, charged with as much data and metadata as possible, yet easy and convenient to use, is desirable.

Fig. 11 My Facebook social circle representation courtesy of the MyFnetwork application. The clusters are middle school, high school, BA, my first MA, various countries and jobs

Common wisdom says that two brains are better than one, but how about 20 million, or 200 million? Of course, the software I am working on can only hope to contend with users on the order of 5-20K, yet even these numbers are vastly superior to the limited commissions set up nowadays to identify policy issues and set agendas, especially when these tend to be formed of highly subjective individuals with a personal degree of interest in the matters they supervise. Crowd policy monitoring is my attempt at popularizing a higher level of policy awareness, as opposed to the mere opinion bubbling expressed through Reddit and Twitter.


Information visualization, or in other words, visual data analysis, is the one that relies most on the cognitive skills of human analysts, and allows the discovery of unstructured actionable insights that are limited only by human imagination and creativity. The analyst does not have to learn any sophisticated methods to be able to interpret the visualizations of the data. Information visualization is also a hypothesis generation scheme, which can be, and is typically, followed by more analytical or formal analysis, such as statistical hypothesis testing. (Anonymous Wikipedia editor[91])

Timelines[92]
Chandler (1995) recommends, as the best method of text analysis, the detailed comparison and contrast of paired texts dealing with a similar topic according to syntagmic/paradigmic principles. But representing a policy matter requires more than mere narrative or tree structures. It requires consideration of overarching issues (the corpus of policies) and underlying ones (token component analysis, definitions), the ability to move into detail (EDs, abstracts and links), a time dimension, etc. Enter the most intuitive[93] tool for complex, time dependent issue visualization and comparison: the timeline, a complex structure over which interested individuals can browse and identify faults or opportunities for improvement. Simply put, timelines allow for the ordering of more of every type of data within the same seeming visual field. For example, on your PC monitor you might be able to have a few paragraphs and a couple of topic titles at the same time. On a timeline, you will have 100-1000 topic titles at the same time, arranged in time order, with various visual cues and colors for easy identification. Simply put, comparability at community level is enhanced because of:

- Comprehensive topic visualization: volume, color codes, font choices, etc.
- Preservation of the temporal value of data, customarily lost when data is shown in an ED format, with the preservation of semantic and observable connectors, which allows an insightful user out of the thousands watching to raise an issue before it happens.
- Logarithmic timelines, which include an additional parameter, that of information novelty, meaning a dilation of time as we move away from the present, both into the past and the future, with less detail being exposed for the past and less prognosis for the future.[94]

[91] http://en.wikipedia.org/wiki/Information_visualization#cite_note-3
[92] The original idea came to me from the Encarta Encyclopedia, where human society was structured historically according to selectable topics (everything that the current Timeglider can do), though the idea came too late and Encarta was eliminated by Wikipedia, which has yet to implement such a system.
[93] Historically speaking there are some fascinating examples, as shown at http://www.cabinetmagazine.org/issues/13/timelines.php
[94] A level of sophistication that our current Timeglider software doesn't support directly, only indirectly through resizing.


- Creation of a usage protocol, such as expectations of format (every topic having a body of text).
- Creation of a dictionary, as interpretation is paramount and many internationally reaching laws/agreements seem to lack one.

Also, because of the sheer volume of condensed, community linked data, timelines offer an unexpected opportunity for site growth through harmonization of intent with the greater network that the site is a part of.[95] The PageRank algorithm, for instance, analyzes human-generated links, assuming that web pages linked from many important pages are themselves likely to be important. The algorithm computes a recursive score for pages, based on the weighted sum of the PageRanks of the pages linking to them. PageRank is thought to correlate well with human concepts of importance. As we can see, the current practical implementation of link technology and analysis[96] is based not on academic standards of citation, but on user requirements, of which policy organization is part. It is time to move from the 20-citations author model to the 2000 free internet links model and beyond, to the community enabled 2 million one, in an attempt to create original and genuine solutions for pragmatic problems, which detached theoreticians seem to constantly ignore until one of them brilliantly points out a paradigm shift (by simply stating the imperative obvious).
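For readers unfamiliar with the recursion just quoted, here is a minimal JavaScript sketch of PageRank computed by power iteration over a toy link graph; the damping factor and iteration count are illustrative defaults, not values used by any particular search engine.

// links[i] = array of page indices that page i links to.
// Returns an array of PageRank scores summing (approximately) to 1.
function pageRank(links, damping, iterations) {
  var n = links.length;
  var ranks = [];
  for (var i = 0; i < n; i++) ranks[i] = 1 / n;      // start from a uniform distribution
  for (var it = 0; it < iterations; it++) {
    var next = [];
    for (i = 0; i < n; i++) next[i] = (1 - damping) / n;
    for (i = 0; i < n; i++) {
      var outDegree = links[i].length;
      if (outDegree === 0) {
        // Dangling page: spread its rank evenly over all pages.
        for (var j = 0; j < n; j++) next[j] += damping * ranks[i] / n;
      } else {
        for (j = 0; j < outDegree; j++) {
          next[links[i][j]] += damping * ranks[i] / outDegree;
        }
      }
    }
    ranks = next;
  }
  return ranks;
}

// Example: page 2 is linked by both 0 and 1, so it ends up with the highest score.
console.log(pageRank([[1, 2], [2], [0]], 0.85, 50));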

Exposure draft[97]
The first issue that jumped to my mind when observing the LF functionality was the rather poor interpretation of specific issues offered by end users, in a complete misunderstanding of a policy's operational steps. Issues were being proposed by people with good intentions, but without the necessary ability to articulate them. As such, these issues were the result of an opinion, which similarly inclined individuals would likely follow without giving thought to a proper solution. This type of patchwork solution seems to be a direct result of imperative, immediate events, situations that develop into real social problems which the IT community feels obligated to address. But obviously a community cannot be managed only through imperative direction.

[95] WWW topic input and providing links/outside imagery (hopefully of the updatable type), module, abstracts, topic names, timeline name - for initial input into the human browser, after which the human browser will reverse this trend and, in deepening knowledge, will enhance the existing information loop at the same time as others do.
[96] Link analysis is a subset of network analysis and provides the relationships and associations between very many objects of different types that would be impossible to observe from isolated text pieces.
[97] Presentation of an item of policy for the public. IFRS terminology, though the original idea came to me from traditional encyclopedias and the article presentation of academic journals.


The ultimate model of exposure draft presentation has to be the Wikipedia model, in itself a massive repository of exposure drafts and a model of development. The ED should be joined by a critical assessment/commentary tool, such as the commenting option offered by Microsoft Word for text (Fig. 12), which would offer the community the chance to amend the text of a proposal with suggestions.[98]

Fig. 12 An example of the Word comment function

Dictionary[99]

The existence of a dictionary[100] binds people to a shared understanding and stops individuals at the semantic level of discussion when terms do not coincide, eliminating dissent at later stages and ensuring consistency of approach. In natural language processing, semantic compression is a process of compacting a lexicon used to build a textual document (or a set of documents) by reducing language heterogeneity while maintaining text semantics. As a result, the same ideas can be represented using a smaller set of words. Semantic compression is advantageous in information retrieval tasks, improving their effectiveness (in terms of both precision and recall). This is due to more precise descriptors (reduced effect of language diversity, limited language redundancy, a step towards a controlled dictionary) (Ceglarek et al., 2010). Topic delimitation is critical, as semantic incongruence can lead to a never-ending amount of debate between individuals who share complementary negotiating positions. Even with community support, the amount of work involved in operating with the taxonomies is so large that I hope it is possible to utilize some preexisting resources, in the form of web dictionaries. There just has to be a community accord on the exact definition and its optimal dimensions. As of now, this particular topic requires further investigation.
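As a toy illustration of the semantic compression idea cited above, the JavaScript sketch below maps heterogeneous wording onto a small controlled dictionary of canonical descriptors before terms are compared or indexed. The dictionary entries are invented examples for the labor market domain, not part of Ceglarek et al.'s method or of LF.

// A tiny controlled dictionary: variant terms map to one canonical descriptor.
var controlledDictionary = {
  "jobless rate": "unemployment rate",
  "unemployment level": "unemployment rate",
  "minimum salary": "minimum wage",
  "wage floor": "minimum wage",
  "pension age": "retirement age"
};

// Replace known variants in a text with their canonical descriptors,
// so that two differently worded proposals index to the same terms.
function compress(text) {
  var result = text.toLowerCase();
  for (var variant in controlledDictionary) {
    // The invented variants contain only letters and spaces, so plain split/join is enough.
    result = result.split(variant).join(controlledDictionary[variant]);
  }
  return result;
}

console.log(compress("Raise the wage floor and review the pension age"));
// -> "raise the minimum wage and review the retirement age"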
[98] Original idea: Microsoft Word, because it is an awesome idea inspired by editorial reviews. While LiquidFeedback has a suggestion system, it seems limited by comparison.
[99] Original idea: from the law, where everything has to be explained in detail to avoid litigation, and where the formal language code creates convergence and enforces uniformity and consistency.
[100] Which should include tool and protocol descriptions.


The sortable/search function at site level has to be improved to allow for proper topic selection, whether topics are expressed through the ED, the timeline or the dictionary. Some potential ideas include topic tree navigation, thumbnail selections (commercial site style), better semantic tags for identifying policies (which could possibly enable an automatically tag-populated timeline), and individual user and circle search.

Site brand
To enhance the end user's platform experience and confer an identity on him, we must link him to a community with a clear purpose/image. Of the possible implementations of the platform, I think the only private one I can manage at this point is the think tank implementation (a rather weak, ineffectual and self-defeating[101] model from what I can imagine right now), as I hardly possess the legitimacy to create a civic representation website.[102] Tying the user's virtual identity to his external institutional one is of great concern to LF, as the platform, through the processes it seeks to moderate, is not interested in the individual user per se, but in the institutional side of said individual. The collected IDs need to be homogeneous, for reference and comparability, which already implies clear belonging to an outside group (citizen, party member, etc.). It is a feature enforced by design functionality, which means we are not dealing with a fully scalable social network, but rather with a mirror network. This limitation is essential for purpose definition, unlike in my original idea, where knowledge acquisition was intended to support any and all users. A very different branding system is required here. Therefore, while in my original proposal I was already envisioning a web domain,[103] as I got closer to understanding the LF platform and my modifications I have moved towards creating a technical instance only. If I want to create a proper website, I must raise the question of how I can safely identify and attract the most suited users to create the critical network mass[104] that the process requires. It is a question I will definitely be asking in my own search for PhD support.[105]

[101] It would end up as a heavily loaded version of Reddit, losing its USP, the decision making model (without pragmatic effects, it would just be an impotent tool).
[102] What I can hope for in the long run is that my theoretical work will be recognized by the emerging developer community and I/others will end up implementing some of my ideas at some constituency/institutional level. That implementation would also help support the general LF platform which, in its current form, desperately requires academic validation (before its flaws lead to it being publicly dismantled by theoreticians for hire).
[103] Because of the LF decision making process, my original branding proposal, deconstructingeurope.com, would have to be modified.
[104] The community/website needs enough users (nodes) to grow above a certain critical mass/cost breakeven point of production/connectivity if it is to reap the benefits of my added modifications. Classical economies of scale are on the production side, while network effects arise on the demand side [percentage of online users from the intended institution].
[105] How will the ultimate product address the knowledge architecture process?


I would also propose the adoption of a popular site ethos, represented by general principles such as Joshua's seven laws of economics.[106] These would definitely need a bit of cleaning and would likely have to be subjected to community accord, as they tend to constitute a sort of founding document for the website. Attracting users to the platform should also be pursued through acceptance/naturalization of the code. While there is a need to make policy visible from a deconstruction perspective, there is also a need to make the process of analysis/debate highly natural and organic, not felt by its users; that is, to teach people and then make the method invisible, fully accepted. Making the interactional process as smooth as possible will allow our initiative not to break character and to be accepted by the user, at least in its technical/design implementation. Enabling outside reach: because the target users are highly politicized individuals,[107] we must ensure they can exercise a semblance of executive power after consensual decision making.[108]

[106] http://www.joshuakennon.com/joshua-kennons-seven-laws-of-fair-economics/
[107] Who, regardless of their ability, would give their time to engage with political subjects.
[108] Linking the LF platform to a party would have ensured that by default, but options must be found for a think tank system. Some means to achieve this would be: social site integration to popularize subjects; group wide initiatives such as online petitions, even nominating physical delegates/activists; and, eventually, the community should be able to develop a member support system (scholarships, bursaries) that would go as far as to train individuals for particular roles (community lawyers, etc.).



The project work logs


Project calendar[109]
1) Some of the stages that I had originally envisioned were:
   a. supporting the vision with theoretical support (January-February), research proposal
   b. defining the concept (March)
   c. implementing the concept (April to July)
   d. describing the project academically (August to September 17)
2) However, the actual calendar turned out to be:
   a. supporting the vision with theoretical support (January-February), research proposal
   b. defining the concept (March)
   c. finding that the concept was insufficient after more literature review (April)
   d. researching the LF platform and realizing a 2.0 version was to be launched on 28 June (May)
   e. more knowledge theory research and ACL leg surgery (end of May)
   f. waiting until 28 June for the LF 2.0 launch (English and API features, June)
   g. monitoring LF functionality and refining the knowledge network concept (July)
   h. a [not so] friend comes to live at my place for free and bugs me for two months (July-August)
   i. the LF platform functionality comes at the price of difficulty of installation (July)
   j. after another 5K master's and the surgery period, I'm broke and have to start a full-time job (August)
   k. start implementing the theory for timelines, working on Timeglider (August)
   l. start putting together a paper from 100K words spread across various notes (September)
3) At the present time, the next period of implementation should be:
   m. finalize the LF implementation on the new server (September)
   n. finalize API programming and testing on the server instance (October)
   o. create the site brand and refine social protocols (November-December)
[109] From the start I had an unclear time limit for the complete project implementation, as it required a number of very different stages with which I was unfamiliar. For the original phpBB architecture, with its highly automated functionality, I had envisioned (in January 2012) at least six months until I had a proper website implementation, but as it turned out, after three months of on-and-off work I abandoned it in favor of the better LiquidFeedback platform.


   p. start PR job, create contacts, interest, communicate, promote (December-January)
   q. start monitoring the website functionality (January, if finished)
   r. further refine the concept for a professional iteration (January)
   s. find a job, placement, promote my design for PhD, bursary (September-forever)

Installing the LF software


To make a long story (months) short, here is the install process for the LF platform, described by my supervisor as "a record of the development activity, a record of achievement, a lessons learnt log and a plan for future activity".
1. Initially I was planning to run the Debian OS required by the LF platform on my Alienware MX17 PC, in a dual boot system, but after a few days of frustrating myself with failed Linux installs I came to realize that previous RAID metadata[110] meant that Linux could not read my hard drive properly (I installed Debian/Ubuntu alternately about 10 times).
2. I then frustrated myself installing Debian (the least friendly Linux) on another laptop (salvaged literally from a garbage heap), only to find out that its WIFI board was fried, so my work in finding the right drivers (not included in the Debian distribution on some principle) proved useless.[111] Eventually, after a couple of days, I managed to set up my phone as a WIFI connector by using a non-standard command[112] for dynamic IPs, a little thing that had come along since Linux was last properly updated on my side.
3. After getting Debian to work I was finally ready to install the LF software according to its rather complex set of instructions.[113] Then came the great realization that this is Debian, an OS famous for non-standard command line program installations. On top of that, the great repositories it brags about are poorly managed, and I had to chase program dependencies one by one and install them manually.[114] This single small line of code at the beginning of the FAQ took me another couple of days:

[110] I was one of the lucky few people in the world who got to experience a RAID controller failure on his PC about a year ago. Basically, it can't be fixed, and it destroyed one of my hard drive bays along with a HD.
[111] At this point my Computer Science housemate wished me luck and ran away, unable to sort out the mess of not knowing whether the drivers, the hardware or simply the Debian OS was at fault.
[112] I'll never forget the "dhclient usb0" command as long as I live.
[113] http://dev.liquidfeedback.org/trac/lf/wiki/installation - an installation which took me through C++ and PostgreSQL command functionality, from things I was seeing for the first time (SQL) to things I was seeing after a very long time (C++). For technical details see the webpage.
[114] Nearly 60 individual pieces of software (the installation of which is not as straightforward as in Windows/Mac). Apt-get install and apt-get update might be useful in theory, but they didn't do much for me.


apt-get install lua5.1 postgresql build-essential libpq-dev liblua5.1-0-dev lighttpd ghc libghc6-parsec3-dev imagemagick exim4

4. After installing everything, I received a server error which I had difficulty addressing. It seems that the LF frontend's[115] dependency, WebMCP, had a series of issues accessing its own Lua 5.1 dependency. I tried fixing it on my own for a few days with no success,[116] then contacted the LF support service. Unfortunately, as I was to find out, LF support was provided by someone unfamiliar with WebMCP,[117] who could only give me the generic answer to reinstall it all (Fig. 15).
5. Pressed for time, I abandoned the server instance installation and focused on creating a case study as proof of concept for the enhanced visualization in Timeglider.

Dear Paul, please try the following components with the following version numbers: - WebMCP v1.2.4 - PostgreSQL 8.4, 9.0 or 9.1 - LiquidFeedback Core v2.0.11 - LiquidFeedback Frontend v2.0.2 Those components should work together. If you experience problems installing these components, please write another mail. It would be helpful to paste the error messages in the email rather than taking photos. That way we could read them more easily. Regards Jan Behrens On 08/11/12 10:11, Paul Suciu wrote: > Hi, guys, > > I have encountered a series of issue on installing WebMCP on my Debian > amd64 distro, as a precursor to Liquidfeedback. Since my Masters

Fig.15 Copy of the reply Jan Behrens, LF developer, provided to my request for assistance on the 11th of August 2012

[115] The interface of the LF platform, which manages user interactivity.
[116] Mostly moving libraries around.
[117] While Jan Behrens is one of the LF platform programmers, the WebMCP application is maintained by someone else, whom I was unable to contact directly. I agree with Jan and, as soon as I have the time, I will purge the Lua 5.1 libraries and reinstall a newer version of WebMCP (which has appeared in the meantime).


Without the controllable testing instance,[118] I could not access lfapi (the LF test functions), which meant that any API integration work done so far could not be properly tested.[119] I then decided that the academic paper due in a little more than a month needed a proper visual demonstration,[120] and I proceeded to work on the client-side programming.

Fig. 16 A picture of my work system, from left to right: the Alienware MX17, displaying the Timeglider timeline in Chrome for Windows; the ReadyNAS server; and the recycled HP Pavilion DV6000, running Debian.

A short theoretical introduction to timeline visualization


A timeline is a visual narration and should have a start, a peak/volume and an end. At the same time, it can be construed as a knowledge network module representation within the greater website network, and the even greater WWW. As the most important[121] and least self-explanatory API module I had planned was the timeline, I decided to proceed with its development. Once more, I repeat my intent of enabling the community with academic potential by allowing individual users to freely populate timelines, in an attempt to assist in presenting an issue through the LF platform. The simple circuit would look like this: Table -> data loader (an automatic conversion process) -> Timeglider API -> Liquidfeedback.
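The data loader step is sketched below in JavaScript: it turns spreadsheet-style rows into Timeglider-style event objects using the field names listed in footnote 135 (startdate, enddate, title, description, importance, link, icon, etc.). The wrapper fields around the events array are assumptions for illustration and should be checked against the widget's own example JSON.

// rows: an array of objects exported from a spreadsheet, one per policy event.
// Returns a timeline object whose event field names follow the spreadsheet protocol.
function rowsToTimeline(title, rows) {
  var events = [];
  for (var i = 0; i < rows.length; i++) {
    var r = rows[i];
    events.push({
      id: "ev" + (i + 1),
      title: r.title,
      startdate: r.startdate,              // e.g. "2010-06-28"
      enddate: r.enddate || r.startdate,   // single-day events reuse the start date
      importance: r.importance || "40",    // drives how prominently the event renders
      description: r.description || "",    // modal body; may contain simple HTML
      link: r.link || "",
      icon: r.icon || ""
    });
  }
  // Wrapper fields below are illustrative; check the widget's example JSON for exact names.
  return {
    id: "policy_timeline",
    title: title,
    initial_zoom: "45",
    focus_date: rows.length ? rows[0].startdate : "",
    events: events
  };
}

// Example row, as it might come out of the EU labor market spreadsheet.
var timeline = rowsToTimeline("EU labor market (demo)", [
  { title: "Youth employment initiative proposed", startdate: "2011-12-20",
    importance: "55", description: "<p>Abstract of the proposal with <a href='#'>links</a>.</p>" }
]);
console.log(JSON.stringify([timeline], null, 2));   // the widget's data_source expects JSON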

[118] To give you an idea of the task I am attempting: just to install this package, people with years of programming behind them struggle, and groups such as the Pirate Party of California are asking for donations so they can create their own instances, never mind modifying them.
[119] For a while there, I wasn't a theoretician anymore, but a domain developer, coming to grips with the limits of his programming skills and trying to surpass them because of time constraints.
[120] I honestly thought I had a straight chance of doing these modifications before time ran out, but as I became bogged down in technical details and academic externalities, I finally had to concede to doing only the Timeglider presentation of a case study, without too many conclusions, as an example of the visualization function intended for the platform's full service.
[121] Initial position at community policy analysis level, through its investigative and predictive qualities.


While you are by now familiar with the Timeglider link to the LF platform through its API functionality, you will also notice a new component there. That is the part of the user interface allowing the user to insert timeline object values directly into the JSON file, as the current implementation of the widget requires some knowledge of JSON/HTML functionality and direct access to the JSON file.[122]

Timeglider

Why use Timeglider as a plugin? Because it is open source, built in the best possible manner,[123] and is a lightweight, extensible time-viewer-explorer which can zoom/pan and otherwise explore future/past events easily. Timeglider.com provides an authoring environment for creating hosted timelines; this plug-in is meant for enterprise media, medical software, private legal workspaces, etc., all of which may have APIs of their own (Timeglider license file). Surprisingly, while the actual information for the Github download hasn't been updated in a while[124] (about a year, both on the widget site[125] and in the developer notes[126]), the widget has been offered with full functionality[127] to end users. Some of the programmed features of the Timeglider widget include/will include, but are not limited to:
1. Date format, localization to user time zone
2. Search function by semantic topic
3. Legend by icon (allowing for corpus selections)
4. Event attributes (about 20 different categories)
5. Images (clusters, etc.)
6. Audio/video files (not implemented in the version I used)
7. Event editor (not implemented)
8. Printing of a time range or saving to PDF (not implemented yet)
9. Import parsers (RSS, Flickr, Twitter, Facebook, semantic "scraping" of dates on any webpage)
10. Custom modifications such as embedding links in modal paragraphs, multi-links, tuning container/modal size for large screen display
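For completeness, this is roughly how the widget is attached to a page in my setup. The option names follow the examples published with the jQuery widget, so the exact keys and the file paths should be treated as assumptions to verify against the downloaded version.

// Assumes jQuery and the Timeglider widget scripts/CSS are already loaded on the page.
// "data_source" points at the JSON produced by the loader sketched earlier.
$(document).ready(function () {
  $("#timeline-placement").timeline({
    data_source: "data/eu_labor_market.json",   // hypothetical path on my instance
    min_zoom: 10,                                // how far the user can zoom out
    max_zoom: 60,                                // how far the user can zoom in
    icon_folder: "js/timeglider/icons/"
  });
});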
[122] An impractical measure for a big server. One possible option would require modifying this open source plugin, http://shancarter.com/data_converter/, for Excel-to-JSON conversion into a loader function similar to the address/email/etc. filling plugins on most sites (a few prompts for creating/loading/modifying data as an option in the LF frontend), suitable for populating spreadsheet tables.
[123] An open source Javascript widget with API integration for a wide range of platforms by design. It also has features of logarithmic timelines by being scalable.
[124] While Timeglider started out as a great idea, free under the MIT license, for pragmatic needs it has been switched to a paid, website dependent platform: http://timeglider.com/levels.php
[125] http://timeglider.com/jquery/ - they upgraded the widget site after my letter, apparently, but because they did it in the last three days or so you are still getting the old version pictures. Awesome guys; I have to write a thank-you note.
[126] https://github.com/timeglider/jquery_widget - the new version solved some image bugs.
[127] I had to find that out by myself while operating on the widget (it also saved me a lot of time I didn't have).


Because of selective zoom, Timeglider allows for an element of the logarithmic timeline, by identifying only the big issues at its macro level and then adding more and more issues as we zoom in: threshold/interesting issues disappearing at the macro level (there was no Javascript when going into detail) and color macros suddenly appearing when going into detail. The only problem is how to structure a view in such a way as to make the best use of these appearances/disappearances to better present and visualize a complex topic. One of the early requirements for the software input was that it had to be easy to populate from a spreadsheet, a tabular format with which most users are familiar. Spreadsheets are a must in data mining, and any user who has had to spend an inordinate amount of time crawling after various pieces of information should find his job made easier when the already formatted data can simply be inserted into a timeline.[128] Of course, data processed through a structured format such as JSON or XML can also be retrieved in a tabular manner[129] (but a reverse plug-in must be provided), as we want the data to be easily accessible and usable.[130] There is a real necessity for creating this tool, as such formats don't allow for mistakes in their body of code.[131] After transforming the Excel table into a JSON file with the Github.com open source Mr. Data Converter[132] plug-in, the transformation was incomplete and I had to use a JSON validator,[133] which was a must because of the high volume of data.[134]
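Because a single malformed entry keeps the whole file from loading, a small script can stand in for the online validator: the Node.js sketch below (with an assumed file name, and assuming the array-of-timelines shape produced by the loader sketched earlier) reports whether the JSON parses and, if not, relays the parser's error message.

// Minimal JSON sanity check for a timeline data file (run with: node validate.js).
var fs = require("fs");

function validateJsonFile(path) {
  var text = fs.readFileSync(path, "utf8");
  try {
    var data = JSON.parse(text);
    var events = (data[0] && data[0].events) ? data[0].events.length : 0;
    console.log("OK: parsed " + events + " events from " + path);
  } catch (err) {
    // V8's error message usually includes the offending position in the string.
    console.log("Invalid JSON in " + path + ": " + err.message);
  }
}

validateJsonFile("eu_labor_market.json");   // hypothetical file name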

[128] Provided that the necessary protocols for the variable fields have been fulfilled, otherwise the data cannot be read by the software. See the Excel file annex, which was used to populate the JSON file annex.
[129] An HTML table would allow the data to be recovered with a simple copy-paste and reinserted into a spreadsheet application such as Excel.
[130] Excel transferable data: simple and easy to understand for accountants, which my grader will most likely be. Anyway, in the age of twitterism, it shouldn't be too hard for people to populate an Excel table, should it? After a while the Excel sheet can start to read like a timeline, if you've made enough entries into the JSON file.
[131] Because they are just data organizers, the code literally has to be flawless, otherwise it will not load.
[132] http://shancarter.com/data_converter/ - it allows for data formatting between a variety of languages.
[133] http://jsonformatter.curiousconcept.com/
[134] About 13,000 words for 100+ entries, and it had to be perfectly validated, otherwise it doesn't load. That was fun, as I had over 200 mistakes which I had to correct manually. This sort of hurdle would truly discourage an average end user.


Fig.17 The exceptional moment of the JSON file validation, when I knew the timeline would open with all its cases, after many weeks of work.

Creating a timeline protocol

Knowledge visualization shares some intrinsic characteristics with cartography - the art of making maps (Chen, 2002). It really does, and every wrong move you make can detract from a good user experience. So there must be a protocol beyond the software-imposed ones.[135]
1. Extend the logarithmic functionality of the timeline by treating events according to a logarithmic points system, assigning visibility values according to this tentative table (see the sketch after the table):

Visibility | Population | Costs | Actuality | City[136] | Geography
100        | 100%       | 100%  | Today     | Alpha     | Continent
90         | 50%        | 50%   | Days      | Beta      | Union
80         | 25%        | 25%   | Month     | Gamma     | Country
70         | 10%        | 10%   | Past year | Delta     | NUTS1[137]
60         | 5%         | 5%    | Decade    | Epsilon   | NUTS2
50         | 1%         | 1%    | Century   | Etc.      | NUTS3
Etc.       | Etc.       | Etc.  | Etc.      | Etc.      | Etc.
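The JavaScript sketch below shows how such a points system could be applied when events are loaded: the band boundaries mirror the tentative table above, while the attribute names and the equal weighting of the three dimensions are my own illustration, not part of the Timeglider format.

// Maps an event's scope attributes onto a 0-100 visibility score, following the
// tentative table: wider geography, a larger affected population and more recent
// actuality all push an event towards the "always visible" end of the scale.
var geographyScore = { continent: 100, union: 90, country: 80, nuts1: 70, nuts2: 60, nuts3: 50 };
var actualityScore = { today: 100, days: 90, month: 80, "past year": 70, decade: 60, century: 50 };

function populationScore(shareAffected) {      // shareAffected: 0..1 of the population
  if (shareAffected >= 1.0) return 100;
  if (shareAffected >= 0.5) return 90;
  if (shareAffected >= 0.25) return 80;
  if (shareAffected >= 0.1) return 70;
  if (shareAffected >= 0.05) return 60;
  return 50;
}

function visibility(event) {
  // Simple average of the three dimensions; a real protocol could weight them.
  var g = geographyScore[event.geography] || 50;
  var a = actualityScore[event.actuality] || 50;
  var p = populationScore(event.shareAffected || 0);
  return Math.round((g + a + p) / 3);
}

// Example: an EU-wide measure announced this month, affecting 20% of the population.
console.log(visibility({ geography: "union", actuality: "month", shareAffected: 0.2 }));  // 80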

[135] Such as the first row naming the variables allowed in spreadsheet files (startdate, enddate, high_threshold, id, title, description, icon, date_display, importance, link, image, modal_type, css_class, span_color, y_position, etc.).
[136] City classification by GAWC, http://www.lboro.ac.uk/gawc/


2. Condense information into an efficient summary of the topic by using the modal window. The timeline modal is an abstract-sized, easy to digest mini-article, with links for further exploration and imagery.[138]

Fig. 18 The timeline modal (abstract) with date, topic description, various links, images, etc.; it can be HTML formatted

3. Insert as much qualitative policy detail as possible and less quantitative data, which is better shown by means of charts, etc. A monotonous display of the same type of information gets boring quickly. One needs to create unique categories - not mere tags, but points of interest. For example, a chart is listed on its date of publication, despite referring to events that happened on a previous date. We now talk policy, not indexes; interested parties, not research subjects. It took me a while and a few hundred index entries to realize that one must have a view of policy beyond the limited point-to-point view of an economist/accountant.[139]
4. Define/separate by visual cues the variety of issues involved in policy, such as journal articles, academic papers, political slips, official party policy, lobby groups, shadow policy, policy laundering, current debates, implementation, public feedback, externalities, alternatives, background, triggers, etc.[140]

[137] http://en.wikipedia.org/wiki/Nomenclature_of_Territorial_Units_for_Statistics (wiki link, as the main EU institutional site didn't work at that moment).
[138] The HTML formatting of this page is a must in order to enable proper paragraphs (the size of 2-3 Twitter feeds, as most individuals find it hard to synthesize complex text), multi-links (very important for indexing purposes) and generally a pleasant/fast experience.
[139] If you want to see indexes, you can always use Excel alone to generate a chart, which you can then insert into the timeline. Don't get me wrong: in analyzing a complicated piece of policy or a policy corpus, I can't encourage enough the use of methodically attained data (charts, academic articles and quotes, etc.), feature journal articles and opinions, imagery, etc. to confer legitimacy.



Case study treatment and conclusions


I suppose that more than once while reading this paper you might have asked yourself: where is the accounting/finance, or at least the economic, angle of this paper? Well, here it is - or better yet, access the live demo from the supplementary CDs provided with the paper. Because of the difficulties encountered in explaining a novel concept,[141] I decided I had to create a large case/proof of concept to support my research. While the finality of the case is less important than the mechanism of its analysis, I still felt I had to pick a case that covered a wide market area, both because end users will have to tackle such cases and because I was by now used to treating large scale research. There is also the issue of social utility, which I mentioned in the introduction. As such, I decided to target the policies with the largest number of affected individuals within the easiest to understand environment, and that is public policy within the EU.[142] That still leaves a massive amount of work to be done,[143] as I based my case study on a thorny EU issue: the creation of a unified labor market through the convergence of national markets. Despite it being a case of interest to a wide community, depending on coverage, the conclusions drawn in this paper will not be about social topics, but about how the data was processed by means of the website. Safe to say, however, that I do hope this first case will attract like-minded individuals, from a variety of backgrounds, who will assist me in building the site. I would call this process of getting others involved a reverse feedback loop: people aiming first to provide feedback out of interest, then getting hooked, building content and launching initiatives, spurring others to do the same and helping with said mammoth task. It is not that the EU and national forums don't provide information; it is just that their informational structures have grown disjointed from one another, as clearly exemplified by their heterogeneous interfaces and unlinked approaches. It is as if everyone is competing to be bigger, not more integrated - a typical facet of monolithic bureaucracy.[144]
[140] Enough skill should eventually show policy blueprints through the timeline, in respect of functionality, scope, agents, interconnectivity, relative power, etc.
[141] At least for the academic environment.
[142] While my comprehension of the EU goes beyond the paper union and sees it as an integrated economic union inclusive of the subservient/buffer states outside its borders, I felt I needed to restrict my analysis to a familiar subject for ease of explanation.
[143] Since most of even the most basic data available is squandered in a sea of information on various local language government sites, it still promises to be a mammoth task.
[144] Eurostat, the EU statistical unit, has a wiki, a completely expositional method of presenting data. No debate, no influence. It does, however, establish a baseline, a centralized knowledge repository out of which some policy issues can be reconstituted. I must confess, however, that until now I wasn't aware of the encyclopedia, despite it being around for three years or so. It just goes to show the power of Wikipedia in drowning out competition. Crowd generated information easily trumps the best the EU has to offer in terms of exposure and content volume.


The crowd can bring these divergent ideas into focus, literally identify their strengths and issue recommendations for their upgrade, merger or elimination, by accessing deep information such as constituent individuals, effective powers and costs involved.[145] A unified method of presentation and common standards of assessment must be identified.[146] The case I chose to present is the labor market, the poster child of socio-economic policy.[147] A policy timeline is a bit like a PR press release folder, where all the important issues get put. And because my project aims to serve the individual, I was interested in such issues as the minimum wage, retirement age, youth employment, etc. Catering to the common man makes the issue of synthesizing information in an easy to digest manner essential.[148]

[145] Because, believe me when I say, there is a lot of stuff out there and, ironically, it is not nearly good enough, or enough of it, to run a group of 500 million people. Some of these groups have been around for years and I can't believe how little they have produced.
[146] Putting together a case study on the EU labor market is a yearlong research effort in itself, not because it is hard to fill a spreadsheet, but because it is hard to chase down and decide which data is important. Exposure files often fail to follow the most basic of formats: you can read a report that presents itself at abstract size (perfect for a module, but useless for further investigation), or a report of hundreds of pages, difficult to grasp and hard to boil down into a module after a first reading. The initial image you get is shockingly complex and you strive to make sense of it and put it into perspective.
[147] Unfortunately, when analyzing any such arena one must be familiar with both the prevailing realities in the field and current theoretical developments, the subspecies of socio-economic policy called Labor Economics. As I mentioned before, because I aim to task the crowd user with the job of analyzing policy, I will only build this case study as a demo model of visualization. Also, this is only the first stage of the policy analysis model, so the data might appear rather unfocused. What I am trying to do is to speed up and simplify market wide analysis, not unlike the type of focused research analysis I did before (despite the fact that I was clueless in many respects).
[148] What you are looking at in a window is about 1000 pages of information, structured along category tree lines, temporal relationships (determined/determinant, syntagmatic sub- and supra-divisions, alternative/paradigmic relationships) and logarithmic importance (novelty). Ironically, because of the form of my idea, I've managed to break a personal cardinal rule: make sure people notice the amount of work you do ;). In this case, by design, you should not be overwhelmed.

44

Fig. 19 A timeline of the EU labor market (incomplete), proof-of-concept for the visual representation of complex policy topics in Liquidfeedback

Fig. 20 The newest version of the timeline, launched on the 10th of September, fixes some issues such as automatic image resizing and the general design

The great thing about my timeline is that you don't have to possess any advanced analysis skills when faced with extremely sophisticated information. Copy, paste, follow the rules, and the big picture will appear. Then and only then do you have to make up your mind, unlike in a traditional policy reading where by the end of the first paragraph you have forgotten what it was about149.

149 ...and by the end of the first page you need a coffee to keep you up. God forbid you should attempt to read a book, because you would find yourself, years from now, an old person regretful of a wasted life. Let's imagine for a second that, unlike a specialist, I, as a regular user, don't have weeks to waste, because I have a busy life (and not because I only have a month until I have to present an unfinished, excruciatingly complicated piece of work).
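To make the copy-and-paste workflow above concrete, here is a minimal sketch of how a single policy event might be encoded for a Timeglider-style timeline. This is an illustration under stated assumptions, not the project's actual data file: the field names follow the general shape of the widget's JSON event format as I understand it, and the event title, URL and tags are invented. The tags mirror the category-tree, temporal and importance dimensions described in note 148.

    // Illustrative sketch only: a hypothetical policy event and widget set-up
    // for a Timeglider-style timeline. Field names are assumptions, not the
    // authoritative schema; check the widget's own documentation before use.
    var euLabourTimeline = {
      id: "eu_labour_market",
      title: "EU labour market policy (demo)",
      focus_date: "2012-09-10 12:00:00",
      initial_zoom: 25,
      events: [
        {
          id: "ev_youth_employment_proposal",           // unique key for cross-referencing
          title: "Proposal on youth employment measures",
          startdate: "2012-09-05 09:00:00",
          importance: 60,                               // drives the "logarithmic importance" sizing
          description: "One-paragraph summary plus a link to the full source document.",
          link: "http://example.org/policy/123",        // hypothetical source URL
          tags: ["youth-employment", "EU-level", "draft"] // category-tree position
        }
      ]
    };

    // A jQuery-style initialisation, shown only as a sketch of the intended wiring:
    // $("#timeline").timeline({ data_source: euLabourTimeline });

The point of the sketch is that a contributor only fills in a handful of fields per event; the widget takes care of layout, zooming and filtering.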

A complex representational image forces you to frame a problem in a transparent, manageable way when designing a presentation, which means it is very hard to push a bad argument without showing the holes in it, even to less skilled individuals150. I wish I could say that my attempts at populating the timeline showcased an easy and fast method of doing so, but I would be lying. The necessity of properly structuring data and collecting the essential bits is quite demanding, especially in the first few days, until it becomes a sort of second nature. Some might ask why I bother with so much detail. It is because I want to retain and enhance the functionality of all this fantastic software, to offer added value to the whole process, instead of detracting from its functionality by simply being too lazy for an exhaustive understanding of the concept151.

150 The Labor Timeline doesn't look like much (as day-to-day policy is never very exciting), but imagine you were interested in one particular aspect/country/period of the market: how much time would you save by being able to quickly access the relevant topics?
151 The way most EU economic institutions are, judging by the few weeks I spent on their websites and on those of institutions such as the UK government, which may even deter access by requiring Freedom of Information paperwork, many times for invisible/non-indexed documents.


Conclusions
Due to the modular nature of open source, we have seen how it is possible not only to create fully integrated tools of policy analysis/generation (maybe even implementation), but also how these tools can be used to enable community functions that can prove far superior to commercial ones, because of the higher stakes and capabilities said communities possess as a whole. This arrogance of intellectuality152 quickly dissipates when one is faced with the truth of one's own illiteracy, an illiteracy reflected at the product level of the EU administrative community as a whole. If, as social plutocratic leaders, we are unable to master everything from the ability to formalize natural languages and create social protocols to the ability to quickly and efficiently utilize fully formalized languages in the medium of IT153, then it is no wonder that the community at large, out of sheer frustration, will and should succeed in creating mechanisms of self-governance, by exploring issues such as proxy voting154. With respect to my paper's goals, I do not believe there should be a neat closure for what I envisioned as an essentially iterative process, and I am willing to leave some of the matter open, to frustrate the reader into action. Moreover, due to the vast potential of the theme155, it would have been impossible for me to provide such a closure.

152 While providing a visual proof-of-concept for only the initial stage of policy analysis, on an already existing infrastructure, I have also shown how the very complex nature of such problems both poses a challenge and demands from the average researcher skills that surpass narrow field definitions, such as intimate knowledge of the pervasive technologies in the fields of data manipulation and user interactivity. While not everybody needs to know the complete process of crafting a community-enabled knowledge network, the difficulties I encountered during the project showed me just how removed from pragmatic implementation a graduate of multiple academic institutions, an intellectual by all rights, could be.
153 Don't get me wrong, I don't really believe that individuals who identify themselves only with their narrow IT niche are more suited to policy generation. What I am advocating is a degree of pragmatic completeness.
154 A first and extremely important barrier to surpass is to get over the legacy of security-mad institutional protocols and bypass the need to compartmentalize and control the flow of useful knowledge to such a degree that it becomes useless. While, through its free knowledge repositories, the community is slowly providing a solution to bureaucracy, the established directory organizations must themselves take every step toward adopting open source methodology and support the emerging knowledge community in its goals.
155 And remember, most of this time was eaten away not by the project itself, but by the vast body of literature I had to review and synthesize in an academically digestible form, so that an outside viewer could follow the heuristic conceptual process. Remember that, in the case of Michael Richardson, the Timeglider creator, it only took me half a page to convince him of the validity of my idea, which I believe spurred him into updating his widget site after nearly a year. Such is the convenience of being a motivated political activist with a deep understanding of IT architecture capabilities.

In respect to the work I've done, if this approach is truly needed, all I had to do was create a basic pattern of approach, and the community user will jump at the opportunity to fill in the form. That is the wondrous nature of practical approaches, their ability to self-actualize out of immediate necessity: build it and they will come. I would truly be honored if the LF staff would read my paper and upgrade their amazing software with features that formalize discourse, adapted from academic research practice. I also believe I have proved that my hypotheses156 were valid:

- Community-enabled knowledge architecture is possible157.
- Open source has proven itself pragmatically to be vastly superior to closed repositories when it comes to its ability to allow those with the least resources, who need it the most, to access the general body of knowledge.
- The LF platform, through the rapid adoption of its current version, represents an important step in the popularization of such efficient, technologically enabled mechanisms for decision making at community level, raising both the prospect of better self-governance and the need for higher IT literacy (at community level).
- Additional functionality has now come within the reach of neophytes such as myself, with the propagation of modular software that follows the design principles of object-oriented programming158 and whose principles of efficient organization can be further transplanted at community level. As such, a very useful visual module such as a timeline could be constructed for implementation in relatively experimental software.

Calls for enhanced visualization of complex knowledge structures such as policy have been raised for a while, but have yet to be implemented properly. While some other points I was keen to make are more self-evident, what hasn't really sunk in at the academic community level is the quality of the open source software available. Take, for example, Timeglider, a free, easy-to-manage timeline plug-in, vastly superior to most commercial159, website-dependent offerings, provided by its creators against a sole collateral: the community-supported social contract that the software will be employed for the common good, so as to justify the vast amount of time and capacity invested by its makers. I should perhaps complain about the lack of time160 in managing such a vast concept, but truly the only complaint I have on that side is that I didn't manage to make better use of my time. Even so, the intensity with which my brain functioned in the final weeks of the project was tremendous. I never imagined that having to conceive and manage all aspects of a knowledge-based research project could be so rewarding and offer such insight. There are, of course, a few areas of knowledge, beyond the project, that I would like to explore further:

- mathematical models for knowledge representation and knowledge space construction
- the Schulze method for proxy voting and the practical implications of Game Theory
- semantic search models
- Usenet interactivity studies

As a part of this endeavor I also plan to popularize my activities with similarly interested individuals/institutions, participating in open source developments and research forums such as the ESRC161, with a view to possible PhD funding within the UK162. Economics cannot be simple numbers. It must be political economy, graph theory, algebra, semantics, etc., if we are to truly concern ourselves with relationships of production, just as from a research perspective we cannot all be number crunchers, as there is no more room there. A true researcher must forge his own niche; anyone else who follows him there is simply a student of the concept originator163. All in all, this is a manual on how a single individual has managed to implement a pragmatic process of policy analysis that could benefit a community, which is what I asked for in my research question.

156 Even though I didn't formally identify them as such, and there were quite a few more than is traditionally prescribed by academic research, because of the size of the research. In respect of identifying gaps in knowledge, after reading through the body of knowledge one comes to the conclusion that these gaps are so obvious that they might as well represent truisms. In that respect the paper could hardly be considered scientific and could be construed as some kind of manifesto. However, as I point out from the start, this paper isn't concerned with showcasing gaps in the body of a selectively maintained, closed type of expertise repository, but with addressing the issue at a pragmatic level, as it exists and is delegitimizing the current social constructs.
157 It already exists in the form of free knowledge repositories (communities of knowledge), debate platforms (forums, etc.) and, just beginning, in the form of IT-facilitated consensual decision making (Liquidfeedback, Adhocracy). More than that, as the evolution of such platforms for self-expression and actualization proves, it is self-organizing and in search of further structuring.
158 Most interactional software used for visualization is built on object-oriented platforms (some ubiquitous examples from the project are Javascript and PHP).
159 The switch from the MIT license to a more limited one has also made me keenly aware of the difference between free and open. http://opensource.org/licenses/mit-license.php
160 I suppose I always knew I wouldn't end up with a finished product, but the idea wasn't for me to finish the project out of yet-to-be-properly-tested software, but to use the momentum generated by my master's degree and set up the basis for a future PhD research project. I knew it was going to be hard to manage such a project, but I truly wanted to prove that it is possible and that the issues aren't insurmountable.
161 http://www.esrc.ac.uk/ Economic and Social Research Council (dealing in socio-economic policy, with the motto Shaping Society).
162 While I'm not sure if this type of project is suitable for a 3-year PhD, since in 3 years' time this idea will either blow up or blow over, through my involvement within the context of IT-enhanced knowledge visualization I hope to generate externalities far above merely satisfying academic requirements.
163 Pragmatically speaking, the best job you can have in this world is the one you can conceptualize for yourself and that serves a real social need, as there is probably no competition and you will have first-entry and standard-setting privileges.

Fig. 21 Iteration spiral with heuristic elements. The spiral traces the project's iterations, from initial context to working model:

- Another economic immigrant arrives in England
- Difficulty getting employed within the current accounting/financial environment
- An uneven market makes academic advantage moot, as anyone can function in a no-rules playfield
- The economic environment can only be stabilized through formalization with socio-political intervention
- Current mechanisms of change are ineffectual, serving not community interests but lobby ones
- The reason for this ineffectuality is the highly distributed nature of government plutocracy
- Traditional means of engaging the general community have been forums, blogs, etc.
- Recognizing the ineffectual nature of traditional social and commercial platforms in dealing with complex issues
- The open source community provides a much better example of data integration and social utility
- First project iteration, combining PhpBB forums with blog-style exposure drafts
- The need to formalize policy by submitting it to complex semantic analysis, with the assistance of IT networks
- Discovering the Liquidfeedback political approach, as opposed to an academic research approach
- Attempting to combine intent and knowledge into a community of production
- The need to empower said community with the necessary analysis tools at a common level, to avoid plutocracy
- The means to provide quick access to huge amounts of data is to take advantage of the visual/semantic brain
- The concept of a community-populated timeline, dictionary and other tools added to Liquidfeedback
- Finding Timeglider, an open source timeline created specifically to assist in community organisation
- The need to create an interface allowing users both to easily populate the timeline and to get information back
- With each iteration moving closer to creating a working model of a community policy-generating architecture

List of references

Althaus, Catherine; Bridgman, Peter & Davis, Glyn (2007). The Australian Policy Handbook (4th Ed.). Sydney: Allen & Unwin. (last visited 8 September 2012) http://www.dpac.tas.gov.au/__data/assets/pdf_file/0008/121130/11_What_use_is_the_policy_life_cycle.PDF
Althusser, Louis (1971): Lenin and Philosophy. London: New Left Books
Agre, Philip E. (2002) Real-time politics: The Internet and the political process, The Information Society: An International Journal, Volume 18, Issue 5
Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: the classification of educational goals; Handbook I: Cognitive Domain. New York: Longmans, Green, 1956
Burgin, Victor (1982): Looking at Photographs. In Burgin (Ed.), op. cit., pp. 142-153
Chandler, Daniel (1994): Semiotics for Beginners, http://www.aber.ac.uk/media/Documents/S4B/semiotic.html (last visited 6 September 2012)
Ceglarek, D., K. Haniewicz, W. Rutkowski (2010) Semantic Compression for Specialised Information Retrieval Systems, Advances in Intelligent Information and Database Systems, vol. 283, pp. 111-121, 2010
Chen, Chaomei (2002) Visualization of Knowledge Structures, The VIVID Research Centre, Department of Information Systems and Computing, Brunel University, Uxbridge UB8 3PH, UK, ftp://ftp.cs.pitt.edu/chang/handbook/59b.pdf (last visited 6 September 2012)
Chomsky, Noam (2003). Hegemony or Survival. Metropolitan Books. ISBN 0-8050-7400-7
Cook, Kristin A. and James J. Thomas (Ed.) (2005). Illuminating the Path: The R&D Agenda for Visual Analytics. National Visualization and Analytics Center. p. 30
Coward, Rosalind & John Ellis (1977): Language and Materialism: Developments in Semiology and the Theory of the Subject. London: Routledge & Kegan Paul
Dent, M. Christopher (2008). East Asian Regionalism, Routledge, pp. 27-37
Derrida, Jacques (1974). White Mythology: Metaphor in the Text of Philosophy, New Literary History 6(1): 5-74
Derrida, Jacques (1976). Of Grammatology. Baltimore, MD: Johns Hopkins
Drucker, Peter F. (2002) Managing in the Next Society, Truman Talley Books, St. Martin's Press, New York
Flyvbjerg, Bent (1998) Rationality and Power: Democracy in Practice, The University of Chicago Press, 1998
Gabbay, Dov and Hunter, Anthony (1993) Making inconsistency respectable: Part 2 - Meta-level handling of inconsistency, Symbolic and Quantitative Approaches to Reasoning and Uncertainty, Lecture Notes in Computer Science, 1993, Volume 747/1993, pp. 129-136
Hofstadter, Douglas (1980) Gödel, Escher, Bach: An Eternal Golden Braid. New York: Vintage Books
Huber, Bethina (1999) Experts in organizations - The power of expertise, Institute for Research in Administration, University of Zurich
Huff, A. S. and Huff, J. O. (2001) Re-focusing the business school agenda, British Journal of Management 12, Special Issue, pp. 49-54
Jonassen, D. H. (2006). Modeling with Technology: Mindtools for Conceptual Change. OH: Merrill/Prentice-Hall
Kelman, Herbert C. (2001) Reflections on social and psychological theories of legitimacy, The Psychology of Legitimacy: Emerging Perspectives on Ideology, Justice and Intergroup Relations, Cambridge University Press
King, David (2005) Technology and Transportation: A Conversation with David Levinson, Critical Planning, A Journal of the UCLA Department of Urban Planning, (last visited 8 September 2012) http://www.spa.ucla.edu/critplan/past/volume012/04_King.pdf
Kling, Rob and Courtright, Christina (2003) Group behavior and learning in electronic forums: a sociotechnical approach, The Information Society: An International Journal, Volume 19, Issue 3, 2003, pp. 221-2
Langholz Leymore, Varda (1975): Hidden Myth: Structure and Symbolism in Advertising. New York: Basic Books
Lévi-Strauss, Claude (1972): Structural Anthropology. Harmondsworth: Penguin
Liaw, M. (1998). E-mail for English as a foreign language instruction, System, 26, pp. 335-351
Liaw, Meei-ling (2006) E-learning and the development of intercultural competence, Language Learning & Technology, Vol. 10, No. 3, September 2006, pp. 49-64
Liquidfeedback developer notes (2012) http://dev.liquidfeedback.org/pipermail/main/ (last visited 11 September 2012)
Lilly, John C. (1968) Programming and Metaprogramming in the Human Biocomputer: Theory and Experiments (1st ed.). Communication Research Institute. 1968. ISBN 0-517-52757-X
Lowe, Janet (1997). Warren Buffett Speaks: Wit and Wisdom from the World's Greatest Investor. Wiley. ISBN 978-0-470-15262-1, pp. 165-166
Marmura, Stephen (2008) A net advantage? The internet, grassroots activism and American Middle-Eastern policy, New Media & Society, April 2008, vol. 10, no. 2, pp. 247-271
Messaris, Paul (1994): Visual 'Literacy': Image, Mind and Reality. Boulder, CO: Westview Press
Nichols, Bill (1981): Ideology and the Image: Social Representation in the Cinema and Other Media. Bloomington, IN: Indiana University Press
Page, S. (2006). "Path Dependence". Quarterly Journal of Political Science 1: 88
Paul, R. (1993). Critical thinking: What every person needs to survive in a rapidly changing world (3rd ed.). Rohnert Park, California: Sonoma State University Press
Pearl, Judea (1983). Heuristics: Intelligent Search Strategies for Computer Problem Solving. New York: Addison-Wesley
Peirce, Charles Sanders (1931-58): Collected Writings (8 Vols.). (Ed. Charles Hartshorne, Paul Weiss & Arthur W. Burks). Cambridge, MA: Harvard University Press
Radin, Beryl (2000). Beyond Machiavelli: Policy Analysis Comes of Age. Georgetown University Press. ISBN 0-87840-773-1
Saunders, Mark, Lewis, Philip and Thornhill, Adrian (2007) Research Methods for Business Students, Fourth Edition, Prentice Hall
Shannon, C. E., & Weaver, W. (1949). The Mathematical Theory of Communication. Urbana, Illinois: University of Illinois Press
St. Laurent, Andrew M. (2004) Understanding Open Source and Free Software Licensing, (last visited 14 September 2012) http://hugoroy.eu/doc/understanding_fs_licensingandrewmstlaurent-ccbynd.pdf
Starkey, K. and Madan, P. (2001) Bridging the relevance gap: aligning stakeholders in the future of management research, British Journal of Management 12, Special Issue, pp. 3-26
Taipale, Kim A. (2005) Destabilizing Terrorist Networks: Disrupting and Manipulating Information and Information Flows, presented at: The Global Flow of Information, Yale Information Society Project, Yale Law School, April 03, 2005, http://www.informationretrieval.info/papers/infowar/CAS-YISP-040305.pdf (last accessed online 9 September 2012)
Taipale, Kim A. (2006) Designing Technical Systems to Support Policy: Enterprise Architecture, Policy Appliances, and Civil Liberties, in Emergent Information Technologies and Enabling Policies for Counter Terrorism (Robert Popp and John Yen, eds.), Wiley-IEEE Press, Mar. 2006 (last accessed online 9 September 2012) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=712165
Taipale, Kim A. (2010) Cyber-Deterrence, in Law, Policy and Technology: Cyberterrorism, Information Warfare, and Internet Immobilization (IGI Global 2011) (ISBN 1615208313). (last accessed online 9 September 2012) http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1336045
Taleb, Nassim Nicholas (2007) The pseudo-science hurting markets, FT.com, Financial Times, http://www.fooledbyrandomness.com/FT-Nobel.pdf (last visited 6 September 2012)
Toffler, Alvin (1980) The Third Wave, Bantam Books, USA
Tushnet, Mark Victor (1995) Policy distortion and democratic debilitation: comparative illumination of the countermajoritarian difficulty, Michigan Law Review, Vol. 94, No. 2 (Nov. 1995), pp. 245-301
Zhu, Fu-qiang (2009) Is Economics a science? A survey based on the criterion of the demarcation of science, Journal of Fujian Normal University (Philosophy and Social Sciences Edition), 2009-03
University of Cambridge Online (2012) http://www.lib.cam.ac.uk/open_access/ (last accessed 2nd of September 2012)

A short visual introduction to Liquid Feedback

Fig. A The initial visitor is asked to choose from a variety of options, such as language and location. Units: the location setting for the demo version of LF is very important, as once registered you can only vote in the region/institution you belong to. Again, this registration is done by unique IDs (either an institution-issued unique registration number or the Civic ID registration number).

Fig. B On a registered page, you would see your registered areas and the status of various issues, with an icon showing whether you've voted there directly (you have multiple choices, remember), whether you have delegated a proxy for that policy/area (corpus of policies), and whether that proxy has delegated further (no more than two visible chain links for now). Remember, while your proxy votes for you, you can always cast your vote in person (you are the principal) and the vote will be deducted from them.
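As an illustration of the principal/proxy rule described in Fig. B, the sketch below resolves a member's effective vote by walking the delegation chain until a direct vote is found. This is only a toy model of the behaviour, not Liquid Feedback's own implementation; the function and data names are hypothetical.

    // Illustrative sketch only - not Liquid Feedback's real code. It shows the
    // principle from Fig. B: a delegation chain is followed until a direct vote
    // is found, but a member's own vote always overrides their proxy.
    // delegations: map of memberId -> proxyId; directVotes: map of memberId -> "yes"/"no"
    function effectiveVote(memberId, directVotes, delegations) {
      var seen = {};                     // guard against delegation cycles
      var current = memberId;
      while (current !== undefined && !seen[current]) {
        if (directVotes[current] !== undefined) {
          return directVotes[current];   // the first direct vote on the chain counts
        }
        seen[current] = true;
        current = delegations[current];  // follow the chain to the next proxy
      }
      return null;                       // nobody on the chain voted
    }

    // Example: Alice delegates to Bob, who votes "yes"; Carol also delegates to Bob
    // but casts her own "no", so her own ballot wins.
    var delegations = { alice: "bob", carol: "bob" };
    var directVotes = { bob: "yes", carol: "no" };
    console.log(effectiveVote("alice", directVotes, delegations)); // "yes" (via Bob)
    console.log(effectiveVote("carol", directVotes, delegations)); // "no" (own vote overrides)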

Fig. C Latest events / issue status / your delegations allow a variety of quick browsing options that supplement the poorly implemented search function. You can see here the level of support for the most popular issues (though technically the first one, the one that opens the discussion, should be called a proposition), whether they made it past the 10% barrier of interest, or whether they have been dropped.

Fig. D Members: an incredible feature of this software is its total transparency, manifested in the ability to view all registered members (remember, with a traceable ID) and their proxy delegations. Within the Pirate Party there were a series of strong debates criticizing precisely this total-transparency feature, for reasons of privacy. Also, proxy delegation has been amended so as to automatically suspend the voting weight of inactive members after a period of inactivity. http://liquidfeedback.org/2011/02/06/aussetzen-von-stimmgewicht-bei-inaktivitaet/ (in German)

Fig. E The search function is very basic and semantically maladjusted. So far the only thing it can pick up is the main location given above for any particular initiative. Of course, it needs to be properly updated so that it can search by semantic content, otherwise individuals will be deprived of a must-have tool. Within the site that I'm proposing, this would be the main search function, a level below browser indexing and a level above Advanced Visualization Content Search. At the moment I'm hoping the community will take care of the technical details concerning this function.

Fig. F Each issue has quite a comprehensive discussion page, where suggestions and alternative proposals can be made. If it is a pre-vote issue, it lists the supporters until it reaches the necessary 10% barrier, and then it becomes an initiative on which people can vote within a limited period of time. Issues that don't make it past the 10% barrier get dropped, so as not to split the vote too much.
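A minimal sketch of the 10% barrier described in Fig. F, assuming a simplified two-phase lifecycle (discussion, then voting or dropped); the real platform has more phases and quorum rules, so this is an illustration of the principle only, with invented names.

    // Illustrative sketch only - not the platform's actual rules engine. An issue
    // needs 10% of eligible members as supporters to move from discussion to voting,
    // otherwise it is dropped.
    var SUPPORT_QUORUM = 0.10; // the 10% barrier of interest

    function nextPhase(issue, eligibleMembers) {
      var support = issue.supporters.length / eligibleMembers;
      if (issue.phase === "discussion") {
        return support >= SUPPORT_QUORUM ? "voting" : "dropped";
      }
      return issue.phase; // voting/closed issues are handled elsewhere in this sketch
    }

    // Example: 12 supporters out of 100 eligible members clears the barrier.
    var issue = { phase: "discussion", supporters: new Array(12) };
    console.log(nextPhase(issue, 100)); // "voting"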

Fig. G Every issue/initiative has a series of statistical figures, presented in real time, that enable users to get the best and most transparent experience out of the voting process, while allowing for statistical analysis by third parties, such as researchers.

Fig. H A voting process with two initiatives that have passed the 10% mark as issues. The voters are all listed here (including those voting by proxy), by initiative.

Fig. I To support the initiative, it is possible to offer suggestions (though I'm unsure how this can be done in an efficient manner for many suggestions without a "like" system).

Fig. L The Schulze method for preferential voting, or Cloneproof Schwartz Sequential Dropping, allows for a choice of alternatives by preference order or even disapproval order. You can literally vote on every single option.
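For readers unfamiliar with the counting rule behind Fig. L, here is a compact, illustrative implementation of the Schulze method: tally pairwise preferences, then compute strongest-path strengths. It is a textbook-style sketch rather than the platform's own tallying code; the ballot format and candidate names are invented.

    // Illustrative sketch of the Schulze (Cloneproof Schwartz Sequential Dropping) count.
    // Each ballot ranks candidates; a lower number means "preferred more".
    function schulzeWinners(candidates, ballots) {
      var d = {}, p = {};
      candidates.forEach(function (i) {
        d[i] = {}; p[i] = {};
        candidates.forEach(function (j) { d[i][j] = 0; p[i][j] = 0; });
      });

      // Pairwise preferences: d[i][j] = number of voters ranking i above j.
      ballots.forEach(function (b) {
        candidates.forEach(function (i) {
          candidates.forEach(function (j) {
            if (i !== j && b[i] < b[j]) d[i][j] += 1;
          });
        });
      });

      // Strongest path strengths (Floyd-Warshall style widest-path computation).
      candidates.forEach(function (i) {
        candidates.forEach(function (j) {
          if (i !== j) p[i][j] = d[i][j] > d[j][i] ? d[i][j] : 0;
        });
      });
      candidates.forEach(function (i) {
        candidates.forEach(function (j) {
          candidates.forEach(function (k) {
            if (i !== j && i !== k && j !== k) {
              p[j][k] = Math.max(p[j][k], Math.min(p[j][i], p[i][k]));
            }
          });
        });
      });

      // A candidate wins if no other candidate beats it on strongest paths.
      return candidates.filter(function (w) {
        return candidates.every(function (y) { return w === y || p[w][y] >= p[y][w]; });
      });
    }

    // Example: three voters ranking alternatives A, B and C.
    var ballots = [{ A: 1, B: 2, C: 3 }, { A: 1, B: 2, C: 3 }, { B: 1, C: 2, A: 3 }];
    console.log(schulzeWinners(["A", "B", "C"], ballots)); // ["A"]

The appeal of the method for a platform like this is that voters can rank every alternative (or express disapproval) without strategically splitting the vote between similar proposals.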


Traditional socio-economic policy analysis164 schools

The big picture of policy analysis comes from the field of International Political Economy and, in honoring my previous teacher in the field, I will use one of his books (Dent, 2008). The current fundamental schools of thought (pretty self-explanatory from their titles) are:

a. Neo-realism, which asserts that the various players165 in the field of policy are essentially power maximizers, going for competitive relative advantages rather than the absolute advantages gained through cooperation. The reason for that is the end-game possibility, where placing trust in the wrong partner might lead to a complete loss of the political game and end the entity. This perspective includes hegemonic stability theory166, where a dominant market player stabilizes the policy market. HST works in numerous respects, such as policy corpus definition or the enforcement of a particular policy.

b. Neo-liberalism, founded on individual self-determination and utility-maximizing rationality, where economic Darwinism takes primacy. In a sort of opposition to neo-realism, the theory concedes that, given enough money, one can overcome outside control. Policy should follow the true and fair principle, where true and fair represents the will of the legitimate political power holder, which in the case of a democracy is the individual (Flyvbjerg, 1998).

c. Social constructivism, which takes away precisely the self-determination issue, holding instead that we function as individuals only in relationship to a greater group. Our individualism itself is but a social construct, situated within the great machine of society (a construct in its own right). This is the fertile ground of semiotics where, by allowing for the splitting of data into smaller components, deconstructionism allows policy to be split into token components and the systems to be ordered in a scientific/algorithmic fashion, with a code that we can recycle over and over again within our architecture.

d. Marxist-structuralism, in which the original class struggle has found representation in the modern world, where international restrictive/protectionist plutocracies try to preserve their privileges by generating/enforcing policy upon the lower castes.

164 Analysis and synthesis, to take apart and to put together, or in more modern terms deconstruction and reconstruction, are the means by which policy is to be approached. These methods work best in a structured environment, which brings up the need to formalize policy, both by identifying existing structures and by creating new ones.
165 Game theory, complex systems theory.
166 Chomsky (2003)

Short introduction to Semiotics

Semiotics is the field in which deconstruction first appeared as a method, owing to the superior organizational potentiality of the written word, the same means by which policy is, in its overwhelming majority, transferred within modern society. In respect to the IT architecture, semiotics also offers us the much-needed missing link/connector level between natural-language policy expression and programming standardization. As Chandler (1995) puts it, semiotics has been used for a variety of purposes: by structuralists such as Lévi-Strauss for myth, kinship rules and totemism, Lacan for the unconscious, Barthes and Greimas for the grammar of narrative, in exploring a wide array of social phenomena. He then goes on to quote Julia Kristeva, in that what semiotics has discovered... is that the law governing or, if one prefers, the major constraint affecting any social practice lies in the fact that it signifies; i.e. that it is articulated like a language. The study of semiotics might as well be the study of human policy, as partisan representations of language and political intent often have the same roots. He then goes on to say that while authors such as C. W. Morris define semiotics as the science of signs (Morris 1938, 1-2), calling it a science would be misleading, since as of yet semiotics involves no widely agreed theoretical assumptions, models or empirical methodologies and has tended to be largely theoretical, many of its theorists seeking to establish its scope and general principles. As such, the science of conceptual thought has constantly alienated academics used to more stable/tangible/rigid approaches to critical thinking. The general-purpose use of semiotics has led some to erroneously label it a science, when in fact semiotics is still a relatively loosely defined critical practice rather than a unified, fully-fledged analytical method or theory. At worst, what passes for semiotic analysis is little more than a pretentious form of literary criticism applied beyond the bounds of literature and based merely on subjective interpretation and grand assertions (Chandler, 1995). In fact, because of the reliance of this loose technique on the interpretative skill of its user, some unfortunate practitioners can do little more than state the obvious in a complex and often pretentious manner (Leiss et al. 1990, 214). Kinder voices have also spoken for semiotics, praising its promise of a systematic, comprehensive and coherent study of communications phenomena as a whole, not just instances of it (Hodge & Kress, 1988, 1). This needn't have been a semiotic analysis. It could have been a political-discourse, literary, historical or other kind of analysis, but the idea of a semiotic code, accessible to the end user, seemed the least biased/unengaged with respect to the presentation of truth and the ontological relationships between the constituting elements of a policy text. The classical view is that texts are homogenous and have only one interpretation, the one intended by the author. However, interpretation depends as much on the author's position as on the reader's, and since there are no perfect readers, texts are unavoidably open to subjective interpretation. In fact, as Chandler points out, there are often several alternative (even contradictory) subject-positions from which a text may make sense.

Whilst these may sometimes be anticipated by the author, they are not necessarily built into the text itself. In fact, it is quite common for authors to describe texts as having a life of their own, beyond their intended scope.

There is, however, a limited degree to which policy can be fully codified, as very specific functions will still need to be defined on a case-by-case basis. The large number of exceptions means that the creation of a true syntax might be very difficult. Still, based on the literature review, I believe it might be possible to create a core syntax and a taxonomy of semantic terms. Semantic networks167 are already used in specialized information retrieval tasks, such as plagiarism detection. Search engines also act in a similar manner to the human brain when looking for useful information, just a lot more streamlined and fast. They can also become more and more attuned to particular searches with usage, and given enough time will become capable of identifying complex semantic contexts. In my research question I make a reference to the lowest common denominator, but how does that translate into code terminology? Semioticians distinguish between broadcast codes, accessible to a wide audience (pop music), and narrowcast codes, specific to a specialized group (gospel music). Broadcast codes have the following characteristics in relation to narrowcast ones: they are structurally simpler and repetitive, with a great degree of redundancy, making sure they don't get lost in the process of communication. Because of the limited number of elements they are able to transfer, they are sometimes called restricted codes (Fisk, 1982). Such codes' several elements serve to emphasize and reinforce preferred meanings (Chandler, 1995). Again, we notice that the emphasis is on preference and not on efficiency or consensual decisions, though a consensual decision that favors the restricted meaning is likely to emerge, leading to my lowest common denominator. I myself have to wonder if I am just the object of a very sensible form of ideology or if I am the enabler of my own freedom168, genuinely engaged in creative thought, going beyond mere synthesis169. Individuals differ strikingly in their responses to the notion of media transformation. They range from those who insist that they are in total control of the media which they use to those who experience a profound sense of being shaped by the media which uses them (Chandler, 1995). He then hints at the existence of contextual cues that help us identify the appropriate codes, as they appear quite obvious, over-determined by all sorts of contextual cues. This cueing is part of the metalingual function of signs. Chandler goes on to say that the classic realist text is orchestrated to effect closure: contradictions are suppressed and the reader is encouraged to adopt a position from which everything seems obvious. Sometimes surpassing content in importance, the form can have a major impact on decision making, as we routinely judge a book by its cover, and through the use of what is sometimes called scholarly apparatus (such as introductions, acknowledgements, section headings, tables, diagrams, notes, references, bibliographies, appendices and indexes) we immediately identify a particular text as such (Chandler, 1995). There is a constant danger of textual determinism (a medium-induced bias), when policy is simply read as intended and not subjected to critical analysis.

167 Such processes make use of other information present in a semantic analysis system and take into account the meanings of other words present in the sentence and in the rest of the text. The determination of every meaning, in substance, influences the disambiguation of the others, until a situation of maximum plausibility and coherence is reached for the sentence. All the fundamental information for the disambiguation process, that is, all the knowledge used by the system, is represented in the form of a semantic network, organized on a conceptual basis. In a structure of this type, every lexical concept coincides with a semantic network node and is linked to others by specific semantic relationships in a hierarchical and hereditary structure. In this way, each concept is enriched with the characteristics and meaning of the nearby nodes. http://en.wikipedia.org/wiki/Semantic_search
168 Unfortunately, until my method of thinking becomes the broadcast code, I am unable to compare and change, leading to a false sense of security and elitism.
169 Perhaps the truth is in the middle, as Nietzsche once said that those who stare long enough into the abyss often find that the abyss is staring back at them; or perhaps the truth is bigger than subject-object and, in the spirit of Sartre's being and Heidegger's ontology, we are one with everything, makers and masters of our own universe.
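To make the semantic-network idea from note 167 more tangible, here is a minimal, hypothetical sketch in which policy terms are stored as nodes linked by typed relationships and queried for their neighbours. The concepts and relation names are invented for illustration; a real system would derive them from a curated taxonomy of the policy corpus.

    // Minimal, hypothetical sketch of a semantic network for policy terms.
    // Node names and relation types are invented for illustration only.
    var policyNetwork = {
      nodes: ["minimum wage", "labour market", "youth employment", "retirement age"],
      edges: [
        { from: "minimum wage", rel: "is-instrument-of", to: "labour market" },
        { from: "youth employment", rel: "is-aspect-of", to: "labour market" },
        { from: "retirement age", rel: "is-aspect-of", to: "labour market" }
      ]
    };

    // Return concepts directly related to a term, with the relationship that links them.
    function neighbours(network, term) {
      return network.edges
        .filter(function (e) { return e.from === term || e.to === term; })
        .map(function (e) { return e.from === term ? e.rel + " -> " + e.to : e.rel + " <- " + e.from; });
    }

    console.log(neighbours(policyNetwork, "labour market"));
    // [ "is-instrument-of <- minimum wage", "is-aspect-of <- youth employment", "is-aspect-of <- retirement age" ]

Even a toy structure like this shows how a semantic search could move beyond keyword matching: a query for one policy term can surface the related concepts and the nature of their relationship.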

Intrinsic difficulties of code transfer

Iconicity - As the most material medium for policy representation is undoubtedly the written language, policy is subject to the bias of being as true a representation as we can get, which means that without a superior forum that can decode the original representation, we cannot attempt to modify policy. Basically, it means that the natural feel of policy is so strong on the user/community that a single user will likely just accept policy170. As Chandler (1995) states, instead of drawing our attention to the gaps that always exist in representation, iconic experiences encourage us subconsciously to fill in these gaps and then to believe that there were no gaps in the first place... This is the paradox of representation171: it may deceive most when we think it works best.

Interpretability - In a sense the reverse of iconicity, when we have too many meanings and means of interpretation. A photograph is worth a thousand words, but when one is trying to convey just the essence of an idea, or one's own interpretation of it, a single word will suffice. Codes are interpretive frameworks which are used by both producers and interpreters of texts (Chandler, 1995). The reason we select and combine signs in relation to the codes with which we are familiar is to limit... the range of possible meanings they are likely to generate when read by others (Turner 1992, 17).

Translation - Translation from lower levels of practical theory to higher levels of ideatic theory involves an inevitable loss of specificity and a generalization of the topic, which, moved into a higher class, might lose nuances. Again, this can be seen in terms of digital/analogous. And of course, translation from the particular view of the user to that of the maker invariably changes content interpretation. One of the issues we encounter in policy analysis is that while the textual medium it is expressed in is highly symbolic, the policy code itself can be highly iconic. Historical evidence indicates a tendency of linguistic signs to evolve from indexical and iconic forms towards symbolic forms: a process of symbolization/shared generalization, evolved from an index, a 'generate in the lesser degree', and an icon, a 'generate in the greater degree'. Smaller structures get arranged into more complicated symbolic ones, in an attempt to counter the uncertainty of an entropic universe172.

170 I presume most authorities are seldom concerned with semantics, leaving so many pieces of law open to subjective interpretation.
171 We have built a European Union and made it perfect by praise, not by constant self-actualization.
172 At least on paper, we are the masters of the universe and of our own destinies.

Writing is almost a digital code, with clear rules. The deliberate intention (precision/intent) to communicate tends to be dominant in digital codes, while in analogue codes (unintentional) it is almost impossible... not to communicate. Again, quoting Chandler (1995), the graded quality of analogue codes may make them rich in meaning but it also renders them somewhat impoverished in syntactical complexity or semantic precision. By contrast the discrete units of digital codes may be somewhat impoverished in meaning but capable of much greater complexity or semantic signification. As policy tends to situate itself halfway between a digital and an analogue code, interpretable, while contentiously possessing a dictionary, one must contend that the move mentioned in the preceding paragraph, towards a more symbolic form, is unavoidable and highly desirable173. When analyzing policy, one might have to produce a taxonomy out of necessity, just as I did for my concept, in an attempt to isolate the product of interest, the topic at hand, since policy is quite often extremely difficult to pin down. Again, I must emphasize the need to treat policy as much like a conventional scientific code as possible, for the purposes of analysis, by identifying structural rules and placing it into an observable and comparable context174 (a more primary code, a category of inclusion if you will). There are a number of approaches to textual analysis apart from semiotics, such as rhetorical analysis, discourse analysis and content analysis. Why semiotic analysis and not content, rhetorical or discourse analysis? Semiotics includes rhetorical analysis (debate) and discourse analysis (intent). Simply put, semiotics is the greater class of analysis, and avoids the various types of one-dimensional bias. Content analysis is quantitative and, in the absence of a clear syntax, often at great cost for the research, it was difficult to properly apply such a view to a vast and changing corpus of policies. But software offers the promise of being able to change that, and within the confines of the Liquid Feedback platform I will try to demonstrate the prospects of such functionality.

173 Even without my express desire to codify it.
174 In standardizing code, a very useful tool that appears readily available might be variance analysis.

Educational externality

At its foundation, this whole project is an exercise in self-education and in the communal education of others175. It's about teaching people how to think in a consensual environment, about developing critical abilities beyond the lowest common denominator. The many different understandings of simple terms sometimes reflect very complicated and contextual propositions, actively enforced by large organized groups. This basically means that, in an attempt to create a new organizational paradigm, one must be wary of various connotations and try to reconstruct language in a more socially neutral/friendly form.

Teaching people to read policy

Creating my own code, naming my own concepts through early-life graphic design, started me on the path to an existential self-determination which isn't readily accessible to most individuals, as we live in a world where most people in most societies are confined to the role of spectator of other people's productions176 (Messaris 1994, 121). It is important that we recognize this form of personal empowerment, which enables us to act as social agents. Chandler (1995) tackles the issue of a semantic system which pressures people into code conformity, starting with an over-emphasis on symbolic codes (text, the sciences) over iconic codes (such as design) during formative education. He goes on to say that this conformity translates to the level of their entire lives, and that this institutional bias disempowers people not only by excluding many from engaging in those representational practices which are not purely linguistic, but by handicapping them as critical readers of the majority of texts to which they are routinely exposed throughout their lives. A working understanding of key concepts in semiotics, including their practical application, can be seen as essential for everyone who wants to understand the complex and dynamic communication ecologies within which we live. Looking at the acceptance of codes from a social perspective, in a simplistic manner a code:

- can be hegemonic, its acceptance full and implicit;
- can be subject to debate and improvement, as a contentious issue;
- can encounter a full rejection, as too dissociated from the current social context.

It is extremely important that we eliminate natural semiotic lethargy and recognize that we are part of a prearranged semiotic world where, from the cradle to the grave, we are encouraged by the shape of our environment to engage with the world of signifiers in particular ways (Lewis 1991, 30). While we aren't prisoners of semiotic systems, we are shaped by them throughout our lives. This is a much more fundamental change than
175 http://upload.wikimedia.org/wikipedia/commons/2/24/Blooms_rose.svg
176 I would argue that the inability to master our own inner universes is a sign of enforced immaturity, and one of the issues associated with social manipulation by various institutions interested in the status quo, through such selectors as psychometric tests, etc.


merely seeing under the policy layers, as it forces us to reevaluate ourselves as part of that policy, starting with our role as readers. This is important because many individuals feel like observers and are politically inactive, and self-actualization as an agent of change can spur them into action. Of course, we should also aim to recreate semiotic comfort with respect to community consensus, which does mean establishing new user-friendly processes of analysis, even opening the design of such processes to the community177, to replace the broken traditions. As Chandler (1995) notes, realities are not limitless and unique to the individual, as extreme subjectivists would argue; rather, they are the product of social definitions and as such far from equal in status. Realities are contested, and textual representations are thus sites of struggle. Discussing policy, therefore, is not just fair game in social Darwinism, but also a natural function of the thinking individual. Semiotics is an invaluable tool for looking beyond not just appearances but fundamentally accepted values178, as the more obvious the structural organization of a text or code may seem to be, the more difficult it may be to see beyond such surface features (Langholz Leymore 1975, 9). Semiotics can also help us to realize that whatever assertions seem to us to be obvious, natural, universal, given, permanent and incontrovertible are generated by the ways in which sign systems operate in our discourse communities (Chandler, 1995). We have to see the code behind the concept, lest it degenerate into a system of interpretative hermeneutics, with the reality of inner processes hidden from us. The problem with policy is that quite often it is infused with ideology: instead of reflecting what the ground reality is, policy reflects what we think there should be, often idealistically. The current system of trial and error has unfortunately lowered itself to a blind man's social engineering on a very large scale. An ideology is the sum of taken-for-granted realities of everyday life (Burgin 1982, 46). Because signs both refer to and infer their content, they are often purveyors of ideology. Sign systems help to naturalize and reinforce particular framings of the way things are, although the operation of ideology in signifying practices is typically masked... If signs do not merely reflect reality but are involved in its construction, then those who control the sign systems control the construction of reality. However, common sense involves incoherences, ambiguities, inconsistencies, contradictions, omissions, gaps and silences which offer leverage points for potential social change. The role of ideology is to suppress these in the interests of dominant groups. Consequently, reality construction occurs on 'sites of struggle' (Chandler, 1995). How does ideology work from the point of view of semiotics? Apparently, the ideological code activates individuals predisposed to this type of interpellation. While the classical liberal view tends to see man as an individual whose social determination results from pre-given essences, like talented, efficient, lazy, profligate, etc. (Coward & Ellis, 1977), the structuralist view sees him as a construct built over time from various outside codes.

177 As in the case of larger software suites, such as Debian, where users decide the control structure. And if people can come together to create software that cost about 8 billion USD and saved countless billions, why shouldn't they be able to do the same in creating a policy corpus, when the stakes are much greater?
178 A short ontological rant from a purely deconstructivist perspective.

Because of this preexisting condition, while people might be able to resist messages with which they disagree, with codes that match their own they often find that resistance is futile. Seeing the point simultaneously installs us in a place of knowledge and slips us into place as subject to this meaning (Nichols 1981, 38). Recognition of the familiar (in the guise of the natural) repeatedly confirms our conventional ways of seeing and thus reinforces our sense of self, whilst at the same time invisibly contributing to its construction. The familiarity of the codes in realist texts leads us to routinely suspend our disbelief in the form179. Falling into place in a realist text is a pleasurable experience which few would wish to disrupt with reflective analysis (which would throw the security of our sense of self into question). Thus we freely submit to the ideological processes which construct our sense of ourselves as free-thinking individuals (Chandler, 1995). I wonder how deep we need to go in our pursuit of better policy. Will we end up cutting into the very nature of our society? Many semioticians see their primary task as being to denaturalize signs, texts and codes. Semiotics can thus show ideology at work and demonstrate that reality can be challenged. It can be liberating to become aware of whose view of reality is being privileged in the process (Chandler, 1995). Yet the code that I'm proposing doesn't operate in a political vacuum, as it rides on a social code of the acceptance and necessity of change in the aftermath of a crisis, and attempts to engage existing social institutions180 at the subtle level of a semiotic understanding of policy, in an attempt to spur them into action.

Deconstructing the concept/project and teaching individuals to see it


The project's conceptual/building process and its ultimate functionality are intimately and inexorably linked; therefore, just as I aim for a website where policy is to be deconstructed, I must start by deconstructing my very own textual paradigm. It would be ludicrous to invite people onto a platform which prides itself on transparency and willingly leave the inner mechanisms of conception and intent hidden. We favor solutions and not problems, so an instance of an academic paper that leaves more issues unexplained than resolved is highly unsettling (Chandler, 1995). Unfortunately, since this is in essence a conceptual paper, I am forced to explain through theory not just the mechanisms of the conceptual process, but also the biases by which the mind can remove itself from the process of understanding and thinking in a critical manner and, simply by falling back on familiar frameworks, assess my work as insufficient. Every new conceptual code/process needs teaching on how to see it, and argumentation serves just that purpose. As an analogy, the early photographic experience should suffice, when people had to learn how to look at photographs, as they were not used to stills in a dynamic environment. We take established concepts for granted and forget that


179 And I would argue that the reverse is also true, as unfamiliarity draws out our suspicions.
180 From the general public to the government.


our brain is wired in such a manner that it either becomes blind to the foreign181 or transforms the unknown into something more familiar. The most relevant feedback I got throughout the conception of my idea was "I don't understand what you're trying to say" or "oh, it is a blog/forum/etc." Lack of understanding and structure is the reason why the best solutions never come into discussion. Let's take the example of democracy, a brilliant idea that had been around for two thousand years before it became mainstream. Coming back to the more specific mass adoption of better decision models, maybe people need to be forced by systemic design to adopt a rational-agent path, through the implementation of highly engaging software systems such as the one I will be detailing later on. It's not hard to imagine how someone literate in Excel or HTML can have a higher political impact than a non-literate, as these are useful skills often associated with a pragmatic justification182. Even this comprehension will hinge on the level of experience of the prospective user, as individuals who are currently active on e-forums might have an edge in comprehending the scope of the communicational process I'm trying to frame from a practical instance183. I am by no means looking for ideal readers, but as this paper also constitutes itself as something of a PR manifesto for open source software, being a part of that particular target audience might greatly enhance understanding. I do tend to see the world in terms of personal equivalency, and it might be possible that I demand too much of the reader, through an exhaustive comprehension of the subject, its origins and its externalities. I guess you could say that the perfect reader, the one with a complete grasp of this idea, is my Self alone184.

Making a paper out of good intentions?


Academic papers can vary greatly in their scope, and as I worked on this one I came across a bit of history related to the authors I quote here. For example, Saussure, the father of linguistics, wrote a doctoral thesis on the genitive absolute in Sanskrit (Chandler, 1995), finding a tome of significance in a single grammar rule of a dead language, while I had to struggle to find support in English for my academic paper and had to use rather generic semiotic/textual analysis frameworks185. As I went about my paper I preferred to imagine myself closer to Linus Torvalds and his MSc thesis, called Linux: A Portable Operating
181 There's anecdotal evidence which suggests that when the first Europeans arrived in the Americas, the locals couldn't see them or their ships because of their unfamiliarity.
182 In the same manner as the advantage people who can read and write would have over people who can't. Which makes me ask: why should we not, within the chaotic field of the current political arena, reward the most literate, useful members of society, rather than the gross opportunism of the common politician, with the ability to decide at community level?
183 And consequently be more sympathetic to the difficulties encountered in analyzing/synthesizing what appear to be very disparate domains of knowledge.
184 You could say that, but as the conceptual paradigm is slowly transforming, even while I write, that might prove to be a false assertion also.
185 Then again, that was a hundred years ago; today the norm seems to be my 36-year-old housemate, who got a funded masters through the ESRC to do writing and for his dissertation has to create a fiction story. He enjoys playing squash, watching TV, etc. in his free time, which is pretty much all he has to do.


The academic foundations of the project are rather indisputable, as in an attempt to mix several fields of knowledge I had to rely on tried and true fundamentals. For example, in my semiotic analysis of policy I draw heavily on Chandler's introductory186 Semiotics for Beginners, an exceptionally well written text aimed at semiotics neophytes such as myself, which in many instances I can only quote, as the theories within are at the core of their field. On the other side of the spectrum, I use unusual, rather novel sources of information, such as the extremely pragmatic Liquid Feedback Developer Notes187 and Timeglider's project notes188, for cues and points of insightful analysis into the platforms. I must confess that I came across the unifying Knowledge Architecture treatise by Taipale (2006) rather late in concept development, after initially having passed it over. What I had not realized on the initial read of the abstract is that it validated my work not through alignment, but through a critique of its original design.

Issues associated with exploratory research


There are numerous potential issues associated with this type of research design. I have already mentioned over-complexity, arising not only from the very large number of topic elements, but also from the myriad of possible interaction points. If left unchecked this could lead to analysis paralysis, where one devotes a disproportionate effort to the analysis phase of a project. Another problem is that by devoting insufficient space to such a vast array of topics, I could end up with a diluted academic presentation189, resembling a collage of trans-domain topics. Because the theory behind such a cross-domain project is not set in stone, it has a terrible tendency to stray off topic (some of the diagrams I created for identifying the project taxonomy showed me just how much). In fact, I had to resort to quite a strict basic structure of presentation based on topics/tags sorted in order of importance. Luckily, the project itself has proved to be the unifying core, as its ultimate functionality required a very subjective perspective on most topics. The opposite of a diluted work would be a hermetic one190, where in exploring such a vast array of topics and attempting to present them in accessible language, one would forget that it took nearly a year of personal research and learning of definitions to truly comprehend the scope of such a problem-solving approach.
186 When I say introductory, I don't mean in any way simplistic or easy. It took me nearly two weeks to make sense of the underlying epistemology and be able to integrate it with my take on policy analysis. As I will mention later on, Semiotics is a rather complicated field and there's a long and arduous walk from there to satisfying my research question through IT platforms.
187 http://dev.liquidfeedback.org/pipermail/announce/
188 Included with the GitHub download: https://github.com/timeglider/jquery_widget/
189 Knowing nothing about everything.
190 Knowing everything about nothing.


The irony is that while some of this vast theory will appear to be highly generic (I would describe it as fundamentalistic), in mixing it into the project itself I will be unable to transfer my vision191 and its real-world legitimacy efficiently. Again, what I must ask of the reader is to keep in mind that this is a highly conceptual problem-solving approach (a new idea) to a pragmatic issue. The solutions I have adopted do in fact represent some of the best practices in their respective fields, despite my inability to know beforehand whether their applicability would transfer to a cross-domain approach. Again, this is not an off-the-shelf academic design, but a custom one. This module follows the Research Module, which was in itself a call for fundamentalist research. The only difference is in the size and scope of this initiative. While the original Research Module was intended as practice for the budding researcher, for a more experienced individual fundamentalist research can take on a whole other order of magnitude. That is why, after the months spent formulating a research proposal, we are now in the process of enhancing that proposal, of defining it and formulating longer-term goals. It is, if you will, an extension of the original module, but instead of constituting a two-stage piece of research (proposal, dissertation), it constitutes a larger body of work (research, initial project, masters dissertation with primary qualitative data generated through observation, PhD proposal, full site implementation, exploration of software options and political alternatives, advanced literature research and convention attendance, field networking, which I have already started, generation of quantitative primary data and, finally, a PhD dissertation with hopefully final conclusions). While the ultimate research is a lesson in network construction, influenced by Deconstructivism, the current heuristic attempt to gain insight is much more fundamental, in that as a positivist approach it aims to generate a working interpretational framework for the bigger project. In itself it does not offer all the answers, but it does paint a picture of things to come.

191 Rather self-defeating for a project that includes visualization techniques.


Political implications of the project architecture


As Chandler (1995) yet again puts it, "in a world of increasingly visual signs, we learn that even the most realistic signs are not what they appear to be. By making more explicit the codes by which signs are interpreted we may perform the valuable semiotic function of denaturalizing the latter. In defining realities signs serve ideological functions. Deconstructing and contesting the realities of signs can reveal whose realities are privileged and whose are suppressed. The study of signs is the study of the construction and maintenance of reality. To decline such a study is to leave to others the control of the world of meanings which we inhabit." It is by means of his virtual persona that we legitimize the user into action, both by teaching him the means of policy analysis and community work and by facilitating his transfer into his new critical role: the move from spectator to author of his own reality.

Because designing an architecture for policy generation is a rather contentious issue, I feel obligated to address the political implications of any such project. The immediate externality here would be the perception that this IT architecture seeks to replace established policy generation methods. Nothing could be further from the truth. The creators of the Liquid Feedback platform, which sits at the core of the architecture, have acknowledged that there is a limited pool of individuals willing to invest their time and resources in the process of policy generation, and that most of them are already active within an organization such as a party, union or think tank. They seek, by activating social code levers, to make these individuals adhere to an IT-supported system of consensual decision making and to get them to promote it within their mother organizations, thereby disseminating the platform (or its concept) within the right environment. We must also remember that my initial idea of improving the policy generation model is by no means remarkable, in the sense that a wide number of social agents are attempting to do just this. The Liquid Feedback platform was commissioned by the Pirate Party of Germany192 as a means of adding open source consensual decision making to a crowd well versed in IT usage. The main problem with this entity is that it is a single-issue party193 that appeared as a reaction to a perceived infringement of the right to information and will likely disappear as soon as the issue is settled. What it offers, however, is the image of a heterogeneous group finding common ground and individual voice through an IT technology.
192 http://www.slyck.com/story819_Slyck_Interviews_The_Pirate_Bay. As part of the Pirate Parties that mushroomed all over Europe, starting from the Pirate Bay peer-to-peer sharing system and its choice to react to infringement-lobby pressure. We can see from this 2005 interview that the website had a political agenda quite early on.
193 From a game theory perspective it is extremely interesting to see how a singular agenda has moved from simply countering copyright infringement to supporting civil liberties and, now, through papers such as mine, to generating socio-economic policy, which in my view proposes a much more balanced agenda, as it takes into account the costs associated with these civil liberties.


The first thing people seem to notice about the project is its political externality. While proxy voting has been around for a while, it has been subject to the same pitfalls as direct voting, that is, the semi-permanent delegation of power. To tell the truth, my original idea for the website included a permanent system of proxy delegation through the use of reputation, as in e-commerce models. However, I was delighted to see how Liquid Democracy has found in proxy voting the simplest and most direct solution to complex group representation. Thomas E. Mann and Norman J. Ornstein write, "In a large and fragmented institution in which every member has five or six places to be at any given moment, proxy voting is a necessary evil". LiquidFeedback is ostensibly aimed primarily at political parties as opposed to civic participation, as they hold both political agendas and the highest concentration of highly politicized individuals; the intention was to help strengthen inner-party democracy and make parties more attractive to individuals (Nitsche, 2012), but the design model would literally be just a degenerate form of opinion-expressing platform without the legitimacy of implementing its decisions (attained through the laborious process of using LF's unique decision-making features) at community level.

Make no mistake, you are not obligated to vote by proxy and you may in fact cast your vote directly, but given the large number of policy topics debated (in the hundreds on the GPP site) you might not be willing to spend that much time. Instead, as topics are grouped under particular umbrella terms (such as education, military, etc., going into further levels of hierarchy), you might choose to have a number of representatives voting on subjects, usually the most highly involved individuals on particular topics and those who have either done their research or have in fact highlighted the issue in the first place, as long as your positions on the issue converge (a toy sketch of how such delegated votes could be resolved is given at the end of this section).

Will proxy voting promote radicalization and extreme behavior? It is very likely that as policy modification becomes faster and much more accessible, some radical behaviors will find a place within policy making. For example, it is likely that communities will attempt to raise barriers to access under the guise of self-protection. However, because of the nature of proxy voting, this will either expose the wide support for extreme policies within the community, showing them to be radicalized environments, or it will shame others from the same community into change and political activism.

Democratic reality vs. ideal democracy vs. IT democracy

There is a different type of persuasive discourse associated with each highly politicized type of event. For example, online petitions have shown themselves to act a lot faster than entire monolithic agendas, but have proven not to be representative, as they express single points of view and act in a predatory manner in the political game, creating short-term, unsustainable effects. Monolithic agendas (expressed as timelines) tend to compress issues together, creating an equilibrium point for policy that is further removed from individual bias. Finally, complex agendas have proven themselves to be the most comprehensive when it comes to coverage, but extremely slow in forming. The type of architecture I will be describing here attempts to combine the speed and efficiency of petitions with the comprehensive nature of complex agendas, by harnessing crowd interest in self-determination to accelerate and monitor the process.
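To make the delegation mechanism described above more concrete, here is a minimal sketch of how per-topic proxy votes could be resolved. This is purely illustrative and is not LiquidFeedback's actual code or data model; the structures direct_votes and delegations and the function names are hypothetical stand-ins. The sketch captures only the two properties discussed above: a direct vote always overrides a standing delegation, and delegation chains are followed until they reach someone who voted directly (cycles or dead ends simply count as abstention).

```python
# Purely illustrative sketch of per-topic proxy (delegated) vote resolution.
# NOT LiquidFeedback's actual code or data model; the structures
# `direct_votes` and `delegations` are hypothetical stand-ins.

def resolve_vote(member, topic, direct_votes, delegations, _seen=None):
    """Return the ballot counted for `member` on `topic`, or None (abstention).

    A direct vote always overrides a standing delegation; otherwise the
    delegation chain for that topic is followed until it reaches a direct
    voter. Cycles and dead ends resolve to abstention.
    """
    seen = _seen if _seen is not None else set()
    if member in seen:                       # delegation cycle: nobody voted directly
        return None
    seen.add(member)

    if (member, topic) in direct_votes:      # a direct vote wins over any delegation
        return direct_votes[(member, topic)]

    proxy = delegations.get((member, topic))
    if proxy is None:                        # no vote and no proxy for this topic
        return None
    return resolve_vote(proxy, topic, direct_votes, delegations, seen)


def tally(members, topic, direct_votes, delegations):
    """Count the resolved ballots of all members for a single topic."""
    counts = {}
    for m in members:
        ballot = resolve_vote(m, topic, direct_votes, delegations)
        if ballot is not None:
            counts[ballot] = counts.get(ballot, 0) + 1
    return counts


if __name__ == "__main__":
    members = ["ana", "bob", "cristi", "dana"]
    direct_votes = {("ana", "education"): "yes", ("dana", "education"): "no"}
    delegations = {("bob", "education"): "ana",     # bob trusts ana on education
                   ("cristi", "education"): "bob"}  # cristi trusts bob; chain ends at ana
    print(tally(members, "education", direct_votes, delegations))
    # expected output: {'yes': 3, 'no': 1}
```

Even a toy model of this kind ties back to the transparency argument made elsewhere in this paper: a counted ballot can only be trusted if the chain of delegations that produced it can be inspected.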



Not only that, but the public debate will allow for transparency and participation and will lead to significant externalities in the long run. As Lawrence Lessig (2005)194 puts it, "It's not just that code is non-rival; it's that code in particular, and (at least some) knowledge in general, is, as Weber calls it, 'anti-rival'. I am not only not harmed when you share an anti-rival good: I benefit."

That being said, we must consider another point, the digital divide, whereby the IT literate and the IT illiterate are becoming more and more uneven in the amount of political power they wield. Any initiative that declares itself as attempting to raise the level of the political discourse and then generate a course of action on that plateau is implicitly favoring those with greater policy analysis potential, possibly leaving those most destitute of intellect unable to articulate themselves just as before, gagged in a debate race to the top. The process itself is as important as general knowledge of IT, and it is highly unlikely that anyone who cannot follow an e-forum debate will be able to be active in the discussions, never mind completely IT-challenged people, the beggars of the new millennium. In designing this system, one must be as user friendly as possible so as to avoid walking the path of outright eugenics, so that in the rush to the knowledge society195 we do not abandon valuable organizational capacity. That is why this project and the ideas and technologies it promotes offer themselves as a possible support structure for the traditional democratic process (as there are no silver bullets, and it would be a shame if, after Arbeit Macht Frei196, "knowledge will set you free" should become another cruel or meaningless slogan197), aiming to draw interest from the existing policy decision structures198. Taipale's policy view, with its establishment of a government/corporate network of policy enforcement, had his active professional support: security = freedom.199 His technical models are good, but the way he designs the outcome is horrendous. I chose him as the academic representative of hegemonic policy enforcement.
194 http://www.lrb.co.uk/v27/n16/lawrence-lessig/do-you-floss
195 The silent inheritance of pre-World War II governments (including the winners of the war and the cultural expression that followed), where the state is seen as being morally tasked with advancing society ahead of the individual or communities.
196 "Work will set you free", famously written on the gates of Auschwitz. There has always been a tradition of eugenics within the economic elites of Western countries such as the UK, where individuals such as John Maynard Keynes, who served as Director of the British Eugenics Society, argued that eugenics was "the most important, significant and, I would add, genuine branch of sociology which exists".
197 The US's newly established cybercrime/cyberwarfare divisions seem to have recently inspired similar initiatives among authoritarian regimes, just as its earlier eugenics attempts were quoted at the Hague Tribunal for War Crimes in the aftermath of WW2 as an inspirational model for the Holocaust. As we all know, power corrupts, and cyberwarfare platforms have capabilities that make the Nixon scandal look like a drop in the ocean, especially when hunting down foreign citizens, as in Richard O'Dwyer's extradition: http://www.change.org/petitions/ukhomeoffice-stop-the-extradition-ofrichard-o-dwyer-to-the-usa-saverichard
198 For example, Liquid Feedback is currently being implemented as a civic voting platform within the Friesland region of Germany, at the request of the local authorities. http://www.spiegel.de/politik/deutschland/landkreis-friesland-fuehrt-liquid-feedback-ein-a843873.html
199 An article written by another Romanian, about security and freedom. We have lived it all, we have seen it all.



How can the community, the purveyor of true democracy, compete with a system with such educated allies - highly trained, IT- and policy-literate lawyers, who eventually become judges, who may enforce constitutions? In some respects network science is a battlefield200.

Accountability of the voting process through transparency


The makers of LF believe that "democracy needs trust", and in order to earn this trust, democratic decision making using the internet needs to be transparent; the project is therefore open source and free, allowing anyone to inspect the code and modify it to their consensual decision-making needs (Nitsche, 2012). Not only do the developers offer continuous, if limited, support to others implementing their free platform201, but their strong belief in the ideology it represents has spurred them to collaborate with the Interaktive Demokratie association to promote the use of electronic media for democratic processes. One can infer from the original launch languages that the platform sees itself as having an international reach (English, German and Esperanto). The newly planned languages hint at LF presenting itself as a cure for heavily damaged or marginalized democracies (Greek and Hungarian), and interest has risen in situations in which there is a perception that the classical approach to democracy is unfeasible (Dutch, Italian, Portuguese and Russian from third-party support). There is, however, a dark lining to this silver software cloud in the fact that the creating team is still a private entity that reserves discretionary powers over the development of the core software, despite offering it for community development. By that I mean that at the present moment the company will in fact keep an eye on its own financial interests, and I have noticed an offer of paid support for software implementation. That is no surprise when one realizes that the software is hardly administrator friendly.202 This control focus at the design point will undoubtedly have ramifications for its functionality. By accessing pockets of political will and creating multiple communities, the LF platform literally tries to seed itself in as many movements as possible to avoid the risk of rejection. As the developers themselves mention, at the political level only the Pirate Party is able to use it for decision making, and while other parties have instances of the software running, they are mostly interested in observing its possible role as a game changer.
200 "In 2006, the U.S. Army and the United Kingdom (UK) formed the Network and Information Science International Technology Alliance, a collaborative partnership among the Army Research Laboratory, UK Ministry of Defense and a consortium of industries and universities in the U.S. and UK. The goal of the alliance is to perform basic research in support of Network-Centric Operations across the needs of both nations. In 2009, the U.S. Army formed the Network Science CTA, a collaborative research alliance among the Army Research Laboratory, CERDEC, and a consortium of about 30 industrial R&D labs and universities in the U.S. The goal of the alliance is to develop a deep understanding of the underlying commonalities among intertwined social/cognitive, information, and communications networks, and as a result improve our ability to analyze, predict, design, and influence complex systems interweaving many kinds of networks" (Wikipedia). Their effect is corroborated by Taipale (2010): recently, network analysis (and its close cousin, traffic analysis) has gained significant use in military intelligence for uncovering insurgent networks of both a hierarchical and a leaderless nature. Militaristic hegemonic dominance.
201 MIT open source software license.
202 Unlike established platforms, it doesn't offer an install wizard, which means it requires programming knowledge for instance creation and setup and, worst of all, for moderator functions.


In any case, the decision-making software is unlikely to make headway in the current political climate, as it clearly represents a line of cleavage between IT-able and IT-unable individuals and a challenge to the power of the established political groups representing the latter. Populist political parties203 have been shown to actively block IT initiatives (by throwing doubt on the process, an easy task when you have a majority IT-illiterate population and an incomplete software solution, as in the recent case of Mongolia) and forcefully choose to rely on direct voting, as it allows for the direct manipulation of the lowest common denominator (since it is so highly interpretable and can mean so many things, to each his own, baiting people with a front of idealistic optimism and less realistic expectations). It just might be better to wait until such software becomes legitimized through mainstream usage, which will enable its use as a civic representation platform through public support. "LiquidFeedback was not intended for civic participation in the first place because we saw several challenges. You need to define what happens with the results before you start with the participation. If the results are meant to express the opinion of the citizens of a city or county, there has to be an agreement of the citizens to use the system or a legal foundation justifying the use, and every citizen must be entitled to access this system with exactly one account. In most cases the access control for civic participation will be different from organizations with a member database and has to be kept up to date. You need to decide if and how parliament initiatives shall appear in the system and by which rules they are governed (e.g. you may want to sync the timing of certain initiatives in LiquidFeedback to the political processes)" (Nitsche, 2012).
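As a purely hypothetical illustration of the access-control requirement in the quotation above (exactly one account per entitled citizen, kept up to date against an official register), the sketch below reconciles a list of voting accounts against a citizen registry. Nothing here comes from LiquidFeedback; the registry format, the IDs and the function are invented for the example.

```python
# Hypothetical illustration of the "exactly one account per citizen" requirement.
# Neither the registry format nor this function comes from LiquidFeedback.

def sync_accounts(citizen_registry, accounts):
    """Reconcile voting accounts against an up-to-date citizen registry.

    citizen_registry: set of IDs of citizens currently entitled to participate.
    accounts: dict mapping citizen ID -> account record (at most one per citizen).
    Returns (to_create, to_revoke): IDs needing a new account, accounts to revoke.
    """
    to_create = citizen_registry - accounts.keys()   # entitled, but no account yet
    to_revoke = accounts.keys() - citizen_registry   # holds an account, no longer entitled
    return sorted(to_create), sorted(to_revoke)


if __name__ == "__main__":
    registry = {"RO123", "RO456", "RO789"}                            # official register
    accounts = {"RO123": {"active": True}, "RO000": {"active": True}}  # current accounts
    print(sync_accounts(registry, accounts))
    # expected output: (['RO456', 'RO789'], ['RO000'])
```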

Using forensic policy analysis


"It is precisely this situation of how we constitute knowledge about the past that directly affects the nature of the meaning we impose upon it" (Munslow, 1997). As we can deduce from this statement, the historical study of policy has in fact arrived at a more fundamental understanding of policy analysis, stating that the manner of investigation (input) is fundamental to policy generation (output). No doubt the comprehensive retrospective vision and the lack of pragmatic bias from day-to-day forces help the creation of an objective image, which we could in some respects compare to modern perspectives on our day-to-day lives. It also raises another important point: the study of historic policy can and will be pursued for as long as it is necessary for breaking down assumed knowledge models about our social roles and submitting them to deconstruction, and for as long as we still possess the period repositories204. As an unexpected externality, in a continuous-loop mechanism, this will undoubtedly allow us to view historic events through the mechanism of policy analysis, a far more complete view than material artifacts provide. I do in fact intend to propose such a forensic policy analysis approach through the LF platform model, both for its value in understanding current policy generation (those who forget history are doomed to repeat it) and for its intrinsic value in understanding past societies.

203 Moderate conservatives, not some radical group.
204 Just as some old proprietary formats cannot be read nowadays because of the lack of software keys, so historical policy might not be understood in the future because of the disappearance of the context keys to understand it.

