How to come to terms with Leon Brillouin’s theorem that no information can be acquired without paying a “price,” how to come to terms with the negentropic nature of information?
Guest Lecture Course: from form to spectrum – the computational object in design and in philosophy
The notion of »the object« currently reveals itself in a novel manner that not only produces powerful pragmatic tactics and protocols (e.g. parametrics, agent-based modeling) and the aesthetics of a »geometry of the colossal« (Peter Sloterdijk), but also enriches and complicates the philosophical spectrum of how possibilities and necessities can be reasoned about, established, augmented, preserved or exploited. The notion of »the object« occupies a central position among a younger generation of contemporary philosophical thinkers, who seek to situate the role of speculation and imagination in reasoning from within the relation »the object« maintains to its »computability«.
This lecture class introduces some of the positions in these emerging discourses. We will trace a few invariant themes that currently resurface, and we will examine how they do so in a novel and interesting manner. In doing so, we will pursue the following vector of interest: Are there novel aspects and interesting framings to be found in these discourses for developing a »speculative architectonics«? And how would such an architectonics relate to the double interest formulated by a »verallgemeinerte Baukultur« (Expanded Culture of Building), namely to expand architecture beyond its disciplinary confines on the one hand, while attending to its local situatedness on the other?
COURSE MATERIALS // The positions that will be discussed include Michel Serres’ philosophy of what he calls “the transcendental objective”, as well as a broad range of thinkers and writers who group around labels like speculative realisms, new materialisms, accelerationism, object-oriented design, flat ontology, and speculative poetics. Much of this discourse takes place online, so there are plenty of podcasts as well as texts to connect the students directly with »the sources«. Material for weekly preparation will be suggested, and it is highly recommended to work through it. However, doing so is not mandatory for participation in this course, and it will not be part of the final exam.
FINAL EXAM // In the last meeting, the students are asked to write a short essay (1-2 pages), in which they portray and discuss one particular line of thinking of their own choice from what has been discussed throughout the semester.
pdf of the book “The Speculative Turn” is available here at re-press
pdf of the book “The Parasite” is available here: The Parasite
ADDITIONAL READINGS
Jean-Luc Nancy, “Myth, Interrupted” in The Inoperative Community (1986) Jean-Luc-Nancy-Myth-Interrupted-2
pdf of Ludger Hovestadt, “A Fantastic Genealogy of the Printable”, in: V. Bühlmann, L. Hovestadt, Printed Physics, Metalithikum I, Springer 2012. PRINTED PHYSICS_Hovestadt
SCHEDULE BY WEEKS (weekly materials to prepare):
OCTOBER 14 // Theory of the Quasi-Object (Michel Serres, from: Parasite 1980, p. 224-34).
OCTOBER 28 // Potentiality and Virtuality (Quentin Meillassoux, from: The Speculative Turn, p. 224-36).
NOVEMBER 4 // Reflections on Etienne Souriau’s Les différents modes d’existence (Bruno Latour, from: The Speculative Turn, p. 304-33).
NOVEMBER 11 // Drafting the Inhuman: Conjectures on Capitalism and Organic Necrocracy (Reza Negarestani, from: The Speculative Turn, p. 182-201).
NOVEMBER 18 // The Ontic Principle: Outline of an Object-Oriented Ontology (Levi R. Bryant, from: The Speculative Turn, p. 261-278).
NOVEMBER 25 // The Generic as Predicate and Constant: Non-Philosophy and Materialism (François Laruelle, from: The Speculative Turn, p. 237-260).
DECEMBER 2 // MANIFESTO for an Accelerationist Politics (Alex Williams and Nick Srnicek) and MANIFESTO Xenofeminism: A Politics for Alienation (Laboria Cuboniks)
FINAL PAPERS due January 15 (extended)
1-2 pages, on one of the texts discussed throughout the semester. State how you make sense of it.
please email to: buehlmann@arch.ethz.ch and add „Bestimmende Diskurse TU Wien“ in the subject line.
Guest Seminar: Crystalline – Argumentation and Computational Objects
Reasoning, argumentation, and »the object« maintain a multiplicitous and unsettled relation within the paradigm of computational modeling: computational models are as reflective as they are projective, as analytical as they are synthetic. How can we approach this situation? The French philosopher Michel Serres has suggested »a communicational apriori« to representation and modeling. In this seminar we will become familiar with Serres’ philosophy of what he calls »the transcendental objective«, and we will explore what it might mean to »argue« and to »reason« with concepts that are, as he suggests, to be conceived of as spectrums rather than forms. We will attempt to extrapolate what this shift in perspective (from form to spectrum) implies for terms central to reasoning like object, type, category, class, generalization, abstraction, scheme, diagram, and we will ask what might be gained from these considerations for clarifying what is at stake in terms like product, article, one-of-a-kind, generic, etc.
To complement this abstract and formal level, we will look at key moments in the history of architectural theory where the architectural object has been addressed in novel manners. We will do so with the speculative assumption that each of those transformations has introduced a novel »apriori« for reasoning in architecture. As preliminary and tentative suggestions, e.g.: Vitruvius and the »negotiation apriori«, Alberti and the »representation apriori«, Le Corbusier and the »mobility apriori«.
We will try to identify diverse instances of »computational objects« on various scales in contemporary architecture, and compile them in a lexicon. Each student will choose one such »object« and explore, in short presentations to the class with several iterations over the semester, how it might best be addressed – such that a »reasonable« »argument« can be built around it. The elaboration of a suitable format for such an argument is an overall objective of this course.
FINAL EXAM // As the final work with which students conclude and earn credits for this seminar, they will build a story of their “Argument” around a computational object of their choice.
COURSE MATERIALS // A selection of articles, book excerpts and online lectures will be provided. Weekly preparation of about 20-30 pages will be expected for participation in the seminar. The lecture class “The computational object in design and in philosophy” is largely complementary to this seminar; participation in the lecture course is highly recommended, yet not mandatory.
Michel Serres, The Parasite (2007 [1980]) The Parasite
Jean-Luc Nancy, “Myth, Interrupted” (The Inoperative Community, 1986) Jean-Luc-Nancy-Myth-Interrupted-2
Assignment for Nov 10: see doc here: Assignment for Nov 10
GUESTS // There will be several guest lectures from post-graduate researchers associated with the applied virtuality theory-lab at the CAAD Chair, Institute for Information Technology in Architecture ITA, ETH Zurich. They will present the “computational objects” at stake in a diverse range of architectural research fields. http://www.appliedvirtualitylab.wordpress.com // www.caad.arch.ethz.ch
PhD Colloquy Summer 2015 // Elements Axioms Cryptography
The Summer 2015 PhD colloquium has just ended. We read Zalamea’s book together with Robert Blanché’s “L’Axiomatique” (read the French version, as the English translation simply omits the two crucial chapters on the implications of his discussion for science and philosophy at large!) and the latest work by Elias Zafiris on a geometry of spectra and the role of cryptography at work in algebra.
A great review by Giuseppe Longo, which puts the key themes and intents of Fernando Zalamea’s book “Synthetic Philosophy of Contemporary Mathematics” in context and provides a further outlook: http://www.di.ens.fr/users/longo/files/PhilosophyAndCognition/Review-Zalamea-Grothendieck.pdf
Guest Seminar/Talk with Anke Hennig: Literary communication – is there such a thing as literary information?
Anke Hennig
Central Saint Martins, University of the Arts, London
April 16, 2015, 2 pm
CAAD ETH Zürich
Building HPZ
John von Neumann Weg 9
Floor F
8093 Zürich
Literary Communication: Is there such a thing as literary information?
A contemporary example of the interaction between speculation and poetry is provided by an apparatus built by Peter Dittmer in 2006. The apparatus contains the linguistic codes for dialogue and learns by conversing with people; it is called the Amme (German for ‘wet nurse’) in reference to a procedure whereby the dialogues end either when the participant so chooses or when the Amme tips over a glass of milk.
In my talk I would like to detail the model of literary communication exemplified in the Amme. I will draw on structuralist (Roman Jakobson) and semiotic (Umberto Eco) models and will demonstrate why models that use only the message as their basic unit of communication are doomed to fail. Instead, both the Amme and poetic speculation rely on the grammar of language, taking failure as their starting point.
I would also like to sketch out an alternative tradition of literary interaction, one exemplified by Fedor Dostoevsky’s poetic ontology (Valeri Podoroga) and the speculative poetics of Mallarmé’s Un coup de dés jamais n’abolira le hasard as deciphered by Quentin Meillassoux.
Anke Hennig teaches at Central Saint Martins, University of the Arts, London. Her research interests lie in the poetics of Russian Formalism, the politics of Russian avant-garde media, the aesthetics of totalitarianism and in contemporary theory. In addition to numerous articles, she has also edited an anthology of Russian avant-garde texts (Über die Dinge, 2010). Her recent publications have addressed the chronotopology of cinematic fiction, the present-tense novel, and speculative poetics. She is the author of Sowjetische Kinodramaturgie (2010) and, in cooperation with Armen Avanessian, co-author of Poetika nastoiashego vremeni (2014, Russian translation of Präsens. Poetik eines Tempus, Zurich, 2012) and of Metanoia. Speculative Ontology of Language (Metanoia. Spekulative Ontologie der Sprache, Berlin, 2014).
A Quantum City
Forthcoming from Birkhäuser Vienna, applied virtuality book series Vol. 6 (Spring 2015).
By Ludger Hovestadt, Vera Bühlmann, with Sebastian Michael, Diana Alvarez-Marin, Miro Roman
from the preamble:
Orlando – figment of the imagination, ideal and idol and fallible in every way conceivable but flawless in the eye of the beholder – is given to the world perfectly formed by the gods, themselves constructs of the human endeavour to conquer the unknowable and unknown. Timeless, ageless, and deriving immense powers mostly from an indomitable spirit paired with an enquiring mind, Orlando is all human, all humanity, all humility and all pride: an articulation of the embodied consciousness we may call the experience of being alive. Not good or bad, nor beyond the pale is Orlando, Orlando is wonder and discovery and surprise; and strife for self and self-knowledge and hunger for connections that mean something; and need for identity, desire for the loss of self and urge for survival; and yearning for the tender release that is death and fear of the violent crash into the absence of life that is dying. And aching for a place in history and undoing that history bit by bit. And invention, creation, as much as destruction. And cruelty and kindness and the duality of all things polar and their fusion. And the idea of being itself. (Never even mind religion and statehood and status and tribe and the blood ties that bind and sin and redemption or even forgiveness.) Orlando is all made up which is why Orlando is real, and Orlando, of course, is ancient as much as Orlando is new. Orlando is charged by the gods – subject as they are to their own whims and fancies and with wisdom endowed no more and no less than we can conceive – to embark on a quest to The City. And so, as we go to The City, our protagonist shall be Orlando…
from the book cover:
We are all nomads
native to the universe.
This is a stage play,
a narrative about us on the planet.
About how we relate to each other
and to the Great Masters among us.
Welcome to The City.
The views are wide open and bright.
Cities are powerful and challenging.
The heights are lofty, the abysses are deep.
Take a seat!
Here’s the setting: A planet.
The generic city.
And 100% urbanism.
Raise the curtain!
Everything is connected.
Everything imparts everything.
The self and the other.
Good and evil.
Adland perfection to bad news provocation.
The burning pain of aching souls versus the purity of nature.
Catastrophe and salvation circling each other forever in their merry-go-round.
A Venetian Carnival: Masks, murder, love, perfidy and beauty.
What should I do,
if I am capable of anything
but have no idea what to do?
A Quantum City invites you to tap into the wealth of indexes belonging to our world. You get introduced to Orlando, a person with no noteworthy qualities, nor any particular properties: a human being who has not yet travelled. And it’s because of this that Orlando is singled out by the gods. He sets sail from Crete towards Athens in 320 BCE, hoping to find evidence of perfection. Throughout the book you follow him on his Odyssey through Western civilisation; though Orlando never quite ends up where she intended to go. And yet, by the time she arrives in the New York of the 1960s, all the decisions that have been made must be called hers. Orlando’s adventure is to challenge the collective origin of intellectual nature. In doing so, Orlando becomes neither an authoritarian functionary, nor a restless activist, nor a comfortable member of a bourgeoisie, but a citizen of the digital age, a Quantum Citizen.
This is not a book as you might expect. It doesn’t offer a theory about cities; rather, it speaks of any theory. It is not engaged in solving problems, but it is outraged at the kind of stupidity that cultivates ignorance, at the oppressive and anonymous demand that any solid formulation of a problem should be simple. And above all it takes you on a journey to (re-)discover The City…
A joint research project by:
The Future Cities Lab, NUS and ETH, Singapore
Research Blog with working materials: http://blogs.ethz.ch/prespecific/
»Digital Lineamenta«
Computational modeling in architecture based on a sheaf-theoretic, quantum-semiotic measurement framework
Excerpt from the abstract of a joint research grant proposal for a cooperation with the CPNS Center for Philosophy and the Natural Sciences at California State University, Sacramento (freshly submitted, and still under consideration):
Today, the architectural model and the built building tend to be regarded equally as ‘models’: the omnipresence of images seems to render it superfluous that the ‘manners of appearing’ demand physical reality as a kind of proof. Thereby, the model is not enriched; it loses its very character: namely, a particular kind of ‘speculative potency’.
(Werner Oechslin, »Das Architekturmodell – Idea Materialis«, in: Die Medien und die Architektur, ed. Wolfgang Sonne, Berlin/Munich: Deutscher Kunstverlag, 2011, pp. 131-155; my own translation from the German, VB)
Models are in demand where abstract ideas play a part and need to be rationalized and communicated. The legacy of architectural models lies in closer familiarity with method – with mathesis – than with a modern understanding of theory as a framework for explanation (Werner Oechslin). Its entire reason for existence is to lend itself to active communication in media res, not merely to reflective explanation post rem or to normative prescription ante rem. It is precisely this character of modesty, an openness to speculation coupled with a firm sobriety towards all too fantastical flights of fancy, that seems to have gone out of fashion and – as some argue (e.g. Mario Carpo) – out of service within the contemporary paradigm of digital modeling and its fascination with a computer-graphically induced Virtual Reality: the omnipresence of images seems to make the notion of the model, with its in media res character, superfluous. This project shares the view of Werner Oechslin, who insists that this is not a gain but an impoverishment for architecture. It pursues the following idea of how this legacy may be continued: it will explore the peculiarly »manifest« and »physical« character of a computational model in terms of recent innovations in quantum information theory, its underlying topological structure, and the manifestation of this structure in solid state physics. We intend to translate mathematically sophisticated, state-of-the-art procedures from quantum physics into the corpus of architectural categories. Our proposition is to introduce »quantum-geometrical spectrums« and »topological phases« into the discourse. This entails that we introduce notions of code, signals, and their constitutive cryptological/analytical character into architectural theory. In this way, we intend to work towards a quantum-semiotic understanding of measurement for 21st century architectural theory.
The formalism for this quantum-semiotic measurement framework will be grounded in a topological interpretation of quantum theory which applies sheaf-theoretic topology. The architectonic perspective on this quantum-semiotic measurement framework will be grounded in a materialist informatics perspective on code, and a media-theoretic interpretation of the role of encryption/decipherment in information technology. Both the formalism and the architectonic perspective will be translated to, and grounded in, the field of computational modeling in architecture.
The value of this basic research to the larger academic public is that it promises to contribute to the development of an understanding of theory that does not fall back into structuralist frameworks (even if by deferral or negation of them, as in poststructuralism). Our proposition is that the predominance of linearity (even if complexly intertwined) can be replaced with a predominance of spectrums. This allows us to think of networks not as derivatives of linearity, but the other way around: linear connections can be considered contingent renderings out of concrete, yet quantum-geometrically indefinite, potentia. In architecture, this means that the more or less latent totalitarian genericness of the rapidly spreading practice of »parametricism« can be checked, cultivated and controlled, while appropriating and affirming the digital means on which it works.
CPNS Center for Philosophy and the Natural Sciences, California State University, Sacramento
With CPNS, we share a common interest in developing a notion of architectonic modeling, which we call “Digital Lineamenta”, based on a sheaf-theoretic, quantum-semiotic measurement framework.
Since the lines separating philosophy and science have all but vanished with respect to recent explorations of fundamental questions (e.g., string theory, multiverse cosmologies, complexity-emergence theories, the nature of mind, etc…), the modern breakdown of ‘natural philosophy’ into the divorced partners ‘philosophy’ and ‘science’ must be rigorously re-examined. (Michael Epperson, Founder and Head of CPNS)
The Center for Philosophy and the Natural Sciences, in affiliation with the College of Natural Sciences and Mathematics at California State University Sacramento and the Institute of Mathematics at the University of Athens, Greece, engages in research and scholarship that explores the philosophical implications underlying recent innovations in contemporary science, including those occurring in the areas of quantum physics, cosmology, and the study of complex adaptive systems. This exploration is, in part, a speculative philosophical enterprise intended to contribute to the framework of a suitable bridge by which scientific and philosophical concepts might not only be cross-joined, but mutually supported.
In addition to our research, teaching, and publication initiatives, our mission is to foster and enhance the understanding of modern science and its philosophical, cultural, and social implications. Our work in this regard extends beyond the scholarly community and into the arena of public discourse and public policy. Topics will include the role of science and technology in society, environmental science and policy, bioethics, science education, and other practical applications.
Society for the Study of Bio-political Futures
About the Society
“The Society for the Study of Biopolitical Futures locates itself at the nexus where canonical biopolitical thought needs to be renovated and rejuvenated because it can no longer adequately address the issues and questions—of life and the living, of biopower, of the nature of the political itself—that biopolitical thought itself has raised.”
– Cary Wolfe, co-convenor
ALIVE: Advancements in Adaptive Architecture
ALIVE: Advancements in Adaptive Architecture
edited by Manuel Kretzer and Ludger Hovestadt,
is out and can now be ordered via Amazon.
a collection of essays by leading researchers and practitioners:
Philip Beesley, Jason Bruges, Nicola Burggraf, Vera Bühlmann, Carole Collet, Martina Decker, Stefan Dulman, Alex Haw, Ludger Hovestadt, Tomasz Jaskiewicz, Branko Kolarevic, Manuel Kretzer, Oliver David Krieg, Areti Markopoulou, Achim Menges, Aurélie Mossé, Kas Oosterhuis, Claudia Pasquero, Marco Poletto, Steffen Reichert, Jose Sanchez, John Sarik
Applied Virtuality Book Series, Vol. 8
ISBN 978–3–99043–667–7
In times where the very concept of ‘nature’ is questioned not only in its philosophical dimension but in the core of its biological materiality, we need to reconsider the interrelations between architecture and nature. This applies not only to strategies of environmental responsibility but equally to anticipatory human behavior, transient occupation, and cultural or demographic variety. To address these challenges, this book proposes to embrace the unknown and cultivate the architectural discipline towards an integrated, co-operative and cross-disciplinary practice that responds to natural evolution through more than formal imitation. It unravels compelling, innovative and forward-thinking design narratives by leading international practitioners and researchers who investigate novel associations between architecture, nature and humanity for a future, alive architecture. Structured around the three closely cross-linked core themes “bioinspiration,” “materiability,” and “intelligence,” the book engages with the starting point of an emerging new design field, where the symbiosis of physics, biology, computing and design promises the redefinition of what we call architecture today.
CEP Center for Expanded Poetics, Concordia University, Montréal
The Centre for Expanded Poetics is a creative research laboratory for the interdisciplinary study of structure, form, and fabrication. Initiated in 2014, the Centre for Expanded Poetics is based in the Department of English at Concordia University, Montréal.
A DAY AT THE CAAD CHAIR // BY THE MAS 2012/13 CLASS
Master of Advanced Studies in CAAD: Architecture and Information
PROGRAM OVERVIEW
This MAS class is a full-time, one-year interdisciplinary class of about 12 graduate students interested in research on the next level of Computer Aided Architectural Design. The class comprises 7 modules covering theory, basic skills in the theory of technology and architecture, programming, electronics, and the CNC production of architectural artifacts. The main interest of the research is reflection on the potential of upcoming technologies for future architecture. The class starts on an abstract theoretical and philosophical level and ends with exercises in designing concepts of future architecture on the so-called symbolic level.
WHAT’S NEXT?
Today, information technology is ubiquitous. Most architects have a self-taught working knowledge of visualisation and computer-aided modelling techniques. In some places, there are specialised technical programmes, especially in the areas of parametric design and experimental computer-generated production. This specialist knowledge is not sufficient, however, to keep track of the medial, technical, organisational, economic and political developments in architecture. Information technology has become a driving force in every sphere of activity for architects. But these developments are as yet poorly understood, and so their interpretation is narrow and the architectural landscape diffuse.
This programme is directed at architects, designers and creative people. It offers, for the first time, not technical specialisation but architectural integration on a higher technical level. It conveys profound insights into a variety of technical areas and prompts theoretical reflection as well as promoting an independent personal stance.
The programme is demanding. Technologies are becoming ever simpler and more accessible, but defining an individual position for an architect is becoming more and more difficult. We offer no formulas or solutions. We mistrust the attitude, taken by MIT for example, that popularises, and in doing so naturalises, technology. This, to our minds, amounts to a positioning for power by way of simplification: complexities are being externalised. We believe that this is not enough: technological creation has to be complemented by expertise, not just in technology, but also in creation.
FULL INFORMATION VISIT THE CAAD WEBSITE AND HAVE A LOOK AT THE PRINT DOCUMENTATION
MODULE 4 MAS 2010/11 // ARCHITECTURE AND THEORY
ARCHITECTS REVISITED
The second theory module will revisit different topoi of architectural theory. The students will work out conceptual schemas which will allow them to compare different positions of architectural theory. They will proceed by case studies, for example on Palladio’s approaches to spatial grammar and syntax, on the cosmic scope of French Revolutionary Architecture, on Durand’s rationalization, as well as on more contemporary approaches like the machine à habiter (Le Corbusier), The Architecture of the Well-Tempered Environment (Reyner Banham), Mechanization Takes Command (Sigfried Giedion), or more recent approaches like parametricism (studio Zaha Hadid), Junkspace (Rem Koolhaas), or The Function of Form (Farshid Moussavi), among others. The students’ approaches to analytically formalizing these case studies will be prepared for synthesis and modalization. They will learn how to make these topoi more readily accessible and reconstructible by a sort of “conceptual cross-breeding” of these approaches. In a final exercise, each student will take one approach and reformulate it according to his or her own attitude, or to a fictitious attitude especially conceived and characterized for this occasion. In this way, the students are asked to produce and represent their own written manifestos. The module will start by recapitulating the achievements of the first theory module: what is at stake in the concepts of an architectonics of growth and a general theory of stratification? What is the relevance of concepts such as the plane of consistency, the abstract machine, or double articulation regarding the power of contemporary information technology and the design space that goes along with this kind of technology and infrastructure?
Keeping these aspects in mind, the students will be introduced to a comparatist way of engaging with architectural history and theory. They will analyze what kinds of values certain theories have regarded as elementary, what emphases have been placed where in different theoretical edifices, and what kinds of schemes and concepts they have proposed as mediating between these dimensions. Furthermore, they will look at how the technological conditions predominant in different times reveal their impact in particular architectural manifestos and theoretical models. Especially, we will be concerned with how the different numerical spaces incorporated in the respective technological paradigms allow for different kinds of conception and construction principles, and also for different paradigms of theoretical reflection.
The students will be trained in developing a sense of distinction and comparison between the spaces of potentials and constraints that different “renderings” of such construction principles allow as a design space for architecture. A great emphasis of the course will lie on analyzing the role of technology for architectural theories, as well as the different attitudes taken towards technology therein.
Module 1 MAS 2010/2011 // Information and Architecture
LIVING IN A WORLD OF ABUNDANT POTENTIALS
The first module asks about the use and the possibility of a theory for architecture, with a specific application to and perspective on CAAD. We will be concerned with the relation between architectonics as a methodological, philosophical frame of reference and its application in architectural practice, both from a historical and from a structural point of view. We will especially look at the role technology plays in that relation. Unlike mechanical technology, which operates on the substrate of physical forces, information technology takes information as its substrate. From an architectonic point of view, how can we orientate ourselves when constructing within the symbolic and its potentiality?
The rising importance of concepts like network, field, stratum and plateau, among others, points out that we are learning to maintain a different relation with geometry and quantification, and with the respective socio-political forms of organization in space and time (territorialization/deterritorialization). On a methodological level, the possibility of synthesizing series and constructing sequences as genuinely analytical elements points to the possibility of an abstraction from the geometrical elements and from the mechanical methods of aligning them. The students will be introduced to the larger problematics behind the profound changes that are characteristic of our time, as well as to some key concepts and methods for approaching and dealing with them.
The introduction to these backgrounds will help the students to develop their own position and attitude as future architects. These embedding concepts are key for learning how to design and construct within the space of abundant potentials that the symbolic is. A great emphasis of the course will lie on training the students in a kind of architectonic close reading of demanding texts. Furthermore, they will re-visit conceptual elements of their daily practice, such as a plan, a line, a surface, from a fresh perspective. In a final exercise the students will work out a manifesto for a general architectonics for narrative infrastructures within the symbolic. The results will be presented to the CAAD group at ETH, as well as to a final guest critic.
Module 7 MAS 2011/12 // Architecture and I
TIME AND SPACE ENGENDERED BY ARTIFACTS: ARTIFACTS AS SPECIFYING OPERATORS
Artefacts mobilize the uniformity of spaces and times into an open scope and infinitesimal range of possible arrangements, foldings, and compartmentalizations. With a non-romanticizing eye, we want to look at spaces of intense experience, under the following methodological assumption: grammar provides the possibility structure for what can be expressed in language. We will look specifically at two aspects: grammatical cases and articles. While the latter determine the definiteness or indexicality of nouns (a, this, none, etc.), cases provide the verbs with a voice (passive, active, middle), make the subjects and objects of happenings distinguishable and relatable in a number of ways (nominative, dative, instrumental, etc.), and are capable of expressing circumstantial information such as position or duration in space and time. In this module we will regard urban activities as verbs that engender cases, and we will regard artefacts as the specifying operators of such engendering. The goal of the module is that each student develops the prototype of a conceptual grammar for expressing the inflections that activities (verbs) can exert on things (nouns). The experimental aspect will be to depart from an unusual perspective: we will be especially interested in specifically urban activities enhanced with technological appliances, like a hairdryer, a rice cooker, a moped, a theater space, etc. We will try to read technology in terms of auxiliary structures for expressing new verbal forms, and assume that articulating intensities of experience means expressing the different forms actuality can take.
Understanding the scope of grammatical categories for what can be expressed, in a way that allows for generalization, the students will gain a powerful toolbox for articulating their own brands and identities as globally active architects. We will start with an amateur adventure tour through the »genesis« of generalizing and abstracting: in what situations and contexts, and by whom, were ideas like the postulational method of investigation, the integration of areas by approximation, the coordinatization of points, the derivation of functions, symmetry groups, variational calculus and invariants, the generality of algebraic formulas, the binary code for symbolic logic, or the distinction between cardinal, ordinal or even ideal numbers invented? Each student will read one chapter (or two short ones) in E.T. Bell's Men of Mathematics: The Lives and Achievements of the Great Mathematicians from Zeno to Poincaré (Touchstone 1986 [1937]) and present the core concepts in a few slides. In this way we will gather a catalog of powerful schemata for generalizing and abstracting in an enriching, not an impoverishing way. We can regard these schemata as tools for learning to think, articulate, and dope situations.
EXAMPLES OF STUDENT WORK
Katia Ageva: Body of knowledge
Diana Alvarez-Marin: AutoMOBILE, Destruction of a myth
Grete Soosula: Reach Through not Nearing
Module 4 MAS 2011/12 // Theory and Architecture
ANY OF ALL
Every building practice needs a smallest unit to bring things into proportion and controllable relations. Traditionally, these units are derived by setting some defined magnitude as elementary. The paradigmatic example is the so-called column module in the art of Greek temple building, from which the ratio can be derived and declined across scales to put the whole building into proportion. Today we are working with computers, where the elementary units are bits. Bits are the kind of units which render information into a technologically handleable quantity. Yet bits are, literally speaking, a very strange thing – looked at within the language game of quantities, they are finite formal units of determinacy, or pure determinability. Hence we will call them Any-Bits, or Intensive Quantities. How exactly can they be thought to fit within the language game of quantities, magnitudes, numbers? The second theory module will try to get an idea of this context by revisiting different topoi of architectural theory. We will be looking at the different roles and concepts that proportion, means, or ratio have played in architecture over time. How were they conceived? Derived? Applied? Altered? Legitimated? Articulated? The students will proceed by case studies, for example on Palladio's approaches to spatial grammar and syntax, on the cosmic scope of French Revolutionary Architecture, on Durand's rationalization, as well as on more contemporary approaches like the machine à habiter (Le Corbusier), The Architecture of the Well-Tempered Environment (Reyner Banham), Mechanization Takes Command (Sigfried Giedion), or more recent approaches like Parametricism (studio Zaha Hadid), Junkspace (Rem Koolhaas), or The Function of Form (Farshid Moussavi), among others. By studying the question of systematicity and proportion in their architectural heroes, the students reconstruct how The Architectural Whole has been articulated individually by different architects.
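To make the claim that bits are units of pure determinability somewhat more tangible, here is a small illustrative sketch of our own (not part of the module materials): the number of bits needed to single out one option among n equally likely possibilities is ⌈log₂ n⌉ – each bit is one yes/no determination, regardless of what concrete content is being determined.

```python
import math

def bits_to_determine(n: int) -> int:
    """Minimum number of binary (yes/no) decisions needed to single out
    one option among n equally likely possibilities."""
    return math.ceil(math.log2(n))

# A bit quantifies determinability, not any particular content:
for n in (2, 8, 26, 1000):
    print(f"{n} options -> {bits_to_determine(n)} bits")
```

In this sense the bit is indifferent to what is being measured – it is a unit of determination as such, which is what makes it so strange within the classical language game of magnitudes.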
The students will learn how to make these topoi more readily accessible and reconstructible by a sort of "conceptual cross-breeding" of these approaches. Each student will work with one approach throughout the entire module and frequently present his or her work in progress to the class. As a final result, each student will produce a comic booklet; together these will form the MAS 2012 Comix Series ANY OF ALL. The students will be introduced to a diagrammatic and comparatistic way of engaging with architectural history and theory. They will analyze what kinds of values certain theories have regarded as elementary, what emphases have been placed where in different theoretical edifices, and what kinds of schemes and concepts they have proposed as mediating between these dimensions. Furthermore, they will look at how the technological conditions predominant in different times reveal their impact in particular architectural manifestos and theoretical models. Especially, we will be concerned with how the different numerical spaces incorporated in the respective technological paradigms allow for different kinds of conception and construction principles, and also for different paradigms of theoretical reflection.
EXAMPLES of Student Work
Diana Alvarez-Marin on Rem Koolhaas
Melina Mezari on Etienne-Louis Boullée
Stylianos Psaltis on Peter Eisenman
Module 1 MAS 2011/12 // Information and Architecture
BEYOND ENTROPY: WHEN ENERGY BECOMES FORM
The first theory module will introduce you to some large-scale perspectives for thinking about architectural questions specific to our contemporary information age. Beyond following one of the many trends that have emerged within recent years, like parametric and algorithmic design, digital tectonics and materialism, we will take a more abstract view on computers and look at information technology from the perspective of infrastructures. Many new tools have been introduced and have meanwhile been made accessible to architects in a professional, ready-to-use format. Once you can articulate, formulate and communicate what you want to do, as an architect, the technical steps to realize it can (comparatively) easily be organized with the help of CAAD/CAM, open source and open design communities. What turns out to be most challenging today concerns what kinds of questions or horizons we can frame for our experiments, applications or projects. Yet how do we give shape to visions of future lifeworlds beyond concrete utopia? The perspective we will be concerned with regards information technology within a generational history of technology. The module will introduce you to this theoretical model and elaborate its basic arguments. The core assumption is that information technology, unlike mechanical technology with its diverse machines and apparatuses, is algebraic and operates on the substrate of an interplay between electricity and information. Our starting point is twofold: information, in a technical sense, can be regarded as the formal abstraction of any content-as-representation into a symbolically operative format (digital code); electricity can be regarded as the abstraction of energy from its concrete material storage into a symbolically operative format. The work information technology is able to carry out is productive within the symbolic, prior to being rendered into materiality.
And now consider this: the sun sends 10,000 times more energy to the earth than all of mankind is currently using. Daily. Photovoltaics puts us in the historically singular position that we need no longer rely on exploiting natural stores; we can harvest solar energy by tapping into the solar stream directly. Of course it will still hold that the total amount of energy in the universe can be considered constant, yet the amount of energy encapsulated within agricultural growth and urban cultivation is not. It seems hardly an exaggeration to say that this changes the relation between culture and nature: with regard to the natural stores, we can harvest, store and integrate an abundant amount of surplus energy into our cultural milieus. Our hypothesis for architecture is that the so-called information age gives rise to emerging forms of solar societies, for which an abundance of clean energy will be characteristic.
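The "10,000 times" figure can be checked with a back-of-envelope calculation. The sketch below uses rounded reference values of our own (a solar constant of roughly 1361 W/m² and a global primary power demand of roughly 18 TW); these numbers are assumptions for illustration, not figures from the module text.

```python
import math

# Assumed round numbers (our illustration, not from the course text):
SOLAR_CONSTANT = 1361.0      # W/m^2, mean irradiance at the top of the atmosphere
EARTH_RADIUS = 6.371e6       # m
HUMAN_POWER_DEMAND = 1.8e13  # W, roughly 18 TW of global primary energy use

# The earth intercepts sunlight over its cross-sectional disc:
cross_section = math.pi * EARTH_RADIUS**2
intercepted_power = SOLAR_CONSTANT * cross_section

ratio = intercepted_power / HUMAN_POWER_DEMAND
print(f"intercepted solar power: {intercepted_power:.2e} W")
print(f"ratio to human demand:   {ratio:.0f}")  # on the order of 10,000
```

With these inputs the ratio comes out near 10,000, confirming the order of magnitude of the claim.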
Energy can be decoupled from resources. With networked, information-technological infrastructures, it turns into a problem of logistics. Thanks to the control of electricity by information technology, energy can be rendered into any form: potential, kinetic, chemical, thermal. We can move things, transform substances, deliver messages, and install and operate infrastructures of nearly any kind. But how can we start to think about the forms of living and building in solar societies? There may be, for the time being, no common-sense notions in sight for how to integrate this technological feasibility and genuinely symbolic artificiality into meaningful horizons. It is all the more exciting and important to work conceptually on how to delineate and refer to these novel consistencies that are genuinely symbolic. This first theory module aims at gathering, discussing and refining some crucial vocabulary for talking about our contemporary world.
READINGS
Ludger Hovestadt: A fantastic genealogy of the printable. In: V. Bühlmann, L. Hovestadt, Printed Physics, Applied Virtuality Vol. I, Birkhäuser, Basel 2011.
Vera Bühlmann: Primary abundance, urban philosophy. Information and the form of actuality. In: V. Bühlmann, L. Hovestadt, Printed Physics, Applied Virtuality Vol. I, Birkhäuser, Basel 2011.
Ludger Hovestadt and Vera Bühlmann, The Power Book. A Radical Pathway from Energy Crisis to Energy Culture (forthcoming, draft version on the server)
Peter Sloterdijk: What happened in the 20th century? Cultural Politics: an International Journal, Volume 3, Number 3, November 2007.
Georges Bataille: The meaning of general economy. In: The Accursed Share: An Essay on General Economy, Volume I: Consumption.
Georges Bataille: Laws of a general economy. In: The Accursed Share: An Essay on General Economy, Volume I: Consumption.
Henri Lefebvre: From city to urban society in: The urban revolution. University of Minnesota Press 2003.
Gilles Deleuze and Félix Guattari: The Geology of Morals (What does the Earth think it is?). In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
Gilles Deleuze and Félix Guattari: Apparatus of Capture. In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
Gilles Deleuze and Félix Guattari: Micropolitics and Segmentarity. In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
Monday September 27th – Friday October 21st
Seminar meetings daily: 2-6 pm -> this as a rule; be prepared for spontaneous changes to the schedule.
Guest lectures: to be announced.
A COLLECTIVE VIDEO FEATURING EXCERPTS FROM THE FINAL EXERCISES:
M1_final video_trailer from MAS CAAD on Vimeo: https://vimeo.com/31427213
MODULE 3 MAS 2012/13 // THEORY AND ARCHITECTURE
GRAMMARS AND LOGICS OF FORMS
Since antiquity and throughout different cultures, the theoretical study of forms has been bound up with an ideality that can (somehow) equip us with forms as templates to anticipate happenings, estimate consequences, express desires and plan accordingly – in short, to organize our experience of reality in terms of sequences, series, orders. In this module on architecture and theory within the paradigm of information technology, we will take a comparatistic perspective on the issue of form and formality, based on the following hypothetical narrative:
Let us assume that the theoretical study of forms originates with people starting to consider forms abstracted from things, from their immediate corporeal presence, and projecting them – theorematically – into a realm of ideality where, for the sake of the hypotheticity at stake in such abstraction, time does not pass, presence is virtual (not corporeal), and no corruption prevails. Let us assume furthermore that this narrative of origination does not mark a particular moment in time as the beginning or end of an unfolding story of progress in intellection, but that it incorporates a theme which gains actuality time and time again in different manners. In short, we will assume that there are different symbolic constitutions proper to different conceptions of such ideality and formality. To each of these symbolic constitutions corresponds what we will call – in analogy to the Kantian forms of intuition (space and time) – a form of partitioning (eine Urteilsform) which manifests itself in how we think about counting and measuring, by numbers, magnitudes, units. Let us refer to the respective symbolicness of these constitutions, in analogy to distinctions well established in number theory, as a) natural, b) integral (including amendments regarding zero and negatives), c) real (including amendments regarding infinitesimals and transcendentals), d) complex (including amendments regarding imaginaries); and from here we find ourselves, on the further levels of abstraction, with the various algebraic number fields, domains, and rings, in what we could call – if it were not, inevitably, an intolerable naturalization of the symbolic – a veritable Cambrian Period in the Geology of Symbolic Ideality.
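The sequence a) to d) can be read, in a deliberately simplified gloss of our own (not part of the module text), as successive closures of number domains: each extension admits solutions to equations that the previous constitution could not express.

```python
# Illustrative sketch: each extension of the number domain resolves an
# equation the previous one could not. (Our gloss, not the module's.)
from fractions import Fraction

# a) naturals: x + 3 = 1 has no solution among the naturals ...
# b) ... but the integers supply x = -2:
assert (-2) + 3 == 1
# b) integers: 2x = 1 has no integer solution ...
# c) ... but the rationals/reals supply x = 1/2:
assert 2 * Fraction(1, 2) == 1
# c) reals: x**2 = -1 has no real solution ...
# d) ... but the complex numbers supply x = i:
assert 1j**2 == -1
print("each constitution admits what the previous one excluded")
```

Read this way, each "amendment" in the list above names exactly what a constitution must symbolically tolerate (zero and negatives, infinitesimals, imaginaries) in order to close under operations that the previous one left open.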
Looking at the constitution and the role of series in architecture (Module 3)
While our main interest is, of course, in the algebraic constitutions marked as d) and following, we will focus in this module mainly on the symbolic constitutions a) to c), because those are the ones we can study by learning to read the artifacts they have produced. We will try to develop and train a literacy based on studying how the different forms of partitioning allow us to govern a subject matter, by establishing different regimes of operation which we will call, allegorically, the geometry of form, the arithmetics of form, and the algebra of form. Within each of these forms of governance, what can be identified as "simple" and as "complex" varies categorically: within a geometry of forms (which we can allegorically relate to Euclid) a set of figures can be considered elements; within an arithmetics of forms (whose allegorical persona may be seen in Descartes) certain values and their ranges are considered absolute; within an algebra of forms it is the domains over which the values range that gain a fundamental role (this last paradigm is too recent to ascribe it an allegorical persona – certainly Galois would be a candidate, but also Euler, Gauss, Cayley, Dirichlet or Dedekind).
We will try to develop and train such a literacy by looking at forms and their symbolic constitutions within the registers of Grammar and of Logics. We will assume that a Grammar of Forms treats forms in terms of declension and conjugation according to morphological and syntactical rules, and a Logics of Forms treats forms in terms of their nesting and integration into more complex constructions. Schematically speaking, we will assume that Grammars allow for and aim at the differentiation of forms through conservative modulation, and Logics allow for and aim at the differentiation of forms through integrating novel elements, values or ideas. Our core field of empirical investigation for studying grammars will be the architectural artifacts of the Renaissance and Classicism, and for studying different logics of forms we will look mainly at artifacts from Baroque and Historicist architecture.
The concrete task for each student is to choose one “topos of architectural artifacts” – the villa, the garden, the church, the palace, the market place, the street, etc – and work out a plausible series that can account for the differentiation of their topos throughout different regimes of operation, and the respective grammars and logics applied thereby.
Monday January 7th – Friday January 31st 2013
Seminar meetings daily: 3-5 pm -> this as a rule; be prepared for spontaneous changes to the schedule.
GUEST WORKSHOP IN EXPERIMENTAL FORMS OF LITERACY:
Unplanned Cinema – metropolis, circulation and method -> Guest lectured by Evan Calder Williams, January 28th – 31st 2013
This workshop centers on three intertwined histories – the cinema, capitalism, and the metropolis – to argue that the same problem structures all three: namely, how can the past, the frozen, the negated, and the accumulated be made to produce anew? And what does that generation leave behind? More prosaically, we will confront the issue of circulation as a way to give an account of economic forms, cultural forms, and the possible, though obscure and unassured, bond between the two. Even more simply, we'll try to figure out, amongst other things, what the history of the action movie chase scene has to do with the history of city planning. In the workshop, we'll do two main things: consider an argument together and develop methods of unconventional watching. First, we'll work through a different approach to thinking about that well-established link between the metropolis and cinema, leaving behind more familiar accounts, such as how the cinema provides reflection on the shocks of experience and spatial dislocation during urbanization. Instead, reading both metropolis and cinema as forms of the generative tension between the static and the animated, we'll see how the city – the advanced form of capital's structure of circulation – finds expression in the cinema – the advanced form of reflection on and elaboration of that form – but, crucially, not because a film may take place in a city. The second thing we will do, in order to look at this tension and relation, is to closely watch films. More specifically, we will watch films not as individual texts to be decoded but as sets of passages and tendencies, moments in a scattered history of social and spatial experience. For this reason, we'll consider a set of fragments, genres, recurrent moments, and landscapes.
The stress will be on developing a method of watching adequate to this emphasis on circulation and style, a method that will attempt to denaturalize our temporal and narrative habits: fast-forward, watch out of order, take films as collections of stills, slow things down until we can see them not as reflections on a life but as elaborations of what has been hiding in plain view.
Evan Calder Williams is a writer. His texts, talks, and performances deal with horror, technique, ornament, capital, and negation. He is the author of Combined and Uneven Apocalypse and Roman Letters. He is a Fulbright Fellow in Film Studies in Italy, where he is writing a dissertation on “anti-political” cinema in the 1970s. He writes for Film Quarterly, Mute, The New Inquiry, and Machete, and at his blog Socialism and/or Barbarism.
GUEST LECTURE BY NATHAN BROWN, UC DAVIS, CALIFORNIA, USA
on Capitalism
(readings list and video documentation follow soon)
MODULE 1 MAS 2013/14 // INFORMATION AND ARCHITECTURE
WELCOME TO THE SOLAR SOCIETY – LEARNING TO CONSIDER THE FORM OF ACTUALITY
The first theory module will introduce you to some large-scale perspectives for thinking about architectural questions specific to our contemporary information age. Beyond following one of the many trends that have emerged within recent years, like parametric and algorithmic design, digital tectonics and digital materialism, we will take a more abstract view on computers, computation and computability, and look at information technology from the perspective of its theoretical principles and their practical instantiation in the form of logistical, and increasingly global, infrastructures. As a starting point, consider this: the sun sends 10,000 times more energy to the earth than all of mankind is currently using. Daily. Photovoltaics puts us in the historically singular position that we need no longer rely on exploiting natural stores; we can harvest solar energy by tapping into the solar stream directly. Of course it will still hold that the total amount of energy in the universe can be considered constant, yet the amount of energy encapsulated within agricultural growth and urban cultivation is not. It seems hardly an exaggeration to say that this changes the relation between culture and nature: with regard to the natural stores, we can harvest, store and integrate an abundant amount of surplus energy into our cultural milieus. Our hypothesis for architecture is that the so-called information age gives rise to emerging forms of solar societies, for which an abundance of clean energy will be characteristic. Energy can be decoupled from resources. With networked, information-technological infrastructures, it turns into a problem of logistics. Thanks to the control of electricity by information technology, energy can be rendered into any form: potential, kinetic, chemical, thermal. We can move things, transform substances, deliver messages, and install and operate infrastructures of nearly any kind.
But how can we start to think about the forms of living and building in solar societies? There may be, for the time being, no common-sense notions in sight for how to integrate this technological feasibility and genuinely symbolic artificiality into meaningful horizons. It is all the more exciting and important to work conceptually on how to delineate and refer to these novel consistencies that are genuinely symbolic.
This first theory module aims at gathering, discussing and refining some crucial concepts for talking about our contemporary world. We will be guided by the assumption that the crucial aspect of dealing with concepts is not (primarily) understanding the "right" meanings or definitions, but learning from them about the acts of abstraction involved in conception (thought). Hence, don't be afraid of abstract thought! Information technology operates physically and directly on a symbolic substrate. In this, it differs from any earlier technical Gestalt. This "informal" substrate consists of a restless interplay between electricity and information. We will try to explore two principal considerations. The first: information, in a technical sense, can be regarded as the formal abstraction of any specific content (as representation) into a not primarily representative, but symbolically operative format (digital code); the second: electricity can be regarded as the abstraction of energy from its concrete material storage into a symbolically operative format. The mark of distinction of information technology is that the work it is able to carry out is productive within the symbolic, prior to being rendered into physical materiality. In other, arithmetical words: multiplication is poorly understood if treated as the repetition of addition. A product cannot be comprehensively described as a sum; each such description is merely one rendering of a product's algebraic/symbolic "identity" – which can be "articulated" in many ways on the basis of its possible factorizations. Hence, while considering artifacts, we must assume not only a (historical) autonomy of objects, but also ascribe to them a proper integrity.
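The arithmetical point – that a product is not exhausted by any one of its additive renderings – can be illustrated with a small sketch of our own (not from the module materials): a number has many readings as repeated addition, but one prime factorization as its symbolic "identity".

```python
def prime_factors(n: int) -> list[int]:
    """The prime factorization of n -- its one symbolic 'identity'."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

def additive_renderings(n: int) -> list[tuple[int, int]]:
    """All readings of n as repeated addition: pairs (summand, count)."""
    return [(a, n // a) for a in range(1, n + 1) if n % a == 0]

n = 12
print("identity (prime factors):", prime_factors(n))   # one factorization
print("additive renderings:", additive_renderings(n))  # many sum readings
```

Each pair such as (3, 4) – three taken four times – is only one rendering of 12; the factorization [2, 2, 3] is what all renderings articulate.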
OUR KEY POINTS OF INTEREST
the rôle of abstraction
the notion of the pre-specific in distinction to genericity, generality, specificity, particularity, individuality, singularity, universality, totality
the means of abstraction
the notion of articulation within the interplay between grammar, logics, mathematics, poetics, technics, mediality
the bodies of abstraction
the notion of artefacts in relation to that of things, facts, or essences
orientation within the world of abstraction
indexing: the here and now, in relation to then and there.
Wednesday Sept. 19th – Friday Oct. 26th
Seminar meetings daily: 2:30 to approx. 6 pm -> this as a rule; be prepared for spontaneous changes to the schedule.
OUR CORE READINGS
Gilles Deleuze and Félix Guattari: The Rhizome. In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
Gilles Deleuze and Félix Guattari: The Geology of Morals (What does the Earth think it is?). In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
MODULE 1 2012/13 // ARCHITECTURE AND INFORMATION
THEORY AS INSTRUMENTAL FOR LEARNING
The first theory module will introduce you to some large-scale perspectives for thinking about architectural questions specific to our contemporary information age. Beyond following one of the many trends that have emerged within recent years, like parametric and algorithmic design, digital tectonics and materialism, we will take a more abstract view on computers and look at information technology from the perspective of infrastructures. Many new tools have been introduced and have meanwhile been made accessible to architects in a professional, ready-to-use format. Once you can articulate, formulate and communicate what you want to do, as an architect, the technical steps to realize it can (comparatively) easily be organized with the help of CAAD/CAM, open source and open design communities. What turns out to be most challenging today concerns what kinds of questions or horizons we can frame for our experiments, applications or projects. This inverts the traditional relation: rather than theory being in the service of practice, practice comes to be in the service of theory. We no longer see the value of theory in providing collective perspectives on given circumstances, but in providing exercises that help each student individually develop the intellectual capabilities to view and consider the complexities of a situation.
WITHIN AN ECONOMY OF ENTROPY
The sun sends 10,000 times more energy to the earth than all of mankind is currently using. Daily. Photovoltaics puts us in the historically singular position that we need no longer rely on exploiting natural stores; we can harvest solar energy by tapping into the solar stream directly. Of course it will still hold that the total amount of energy in the universe can be considered constant, yet the amount of energy encapsulated within agricultural growth and urban cultivation is not.
It seems hardly an exaggeration to say that this changes the relation between culture and nature: with regard to the natural stores, we can harvest, store and integrate an abundant amount of surplus energy into our cultural milieus. Our hypothesis for architecture is that the so-called information age gives rise to emerging forms of solar societies, for which an abundance of clean energy will be characteristic. Energy can be decoupled from resources. With networked, information-technological infrastructures, it turns into a problem of logistics. Thanks to the control of electricity by information technology, energy can be rendered into any form: potential, kinetic, chemical, thermal. We can move things, transform substances, deliver messages, and install and operate infrastructures of nearly any kind. But how can we start to think about the forms of living and building in solar societies?
GLOBALIZATION AND COMPUTATIONAL SYNTHESIS – ENGENDERING THE EARTH IN ITS KIND
There may be, for the time being, no common-sense notions in sight for how to integrate this technological feasibility and genuinely symbolic artificiality into meaningful horizons. It is all the more exciting and important to work conceptually on how to delineate and refer to these novel consistencies that are genuinely symbolic. This first theory module aims at gathering, discussing and refining some crucial vocabulary for talking about our contemporary world. Great emphasis will be placed on training the students in a kind of architectonic close reading of demanding texts. Our core reading will be The Geology of Morals (What does the Earth think it is?) by Gilles Deleuze and Félix Guattari.
This text not only provides a wealth of useful concepts for thinking through matters of general concern, as raised for example by Peter Sloterdijk in What happened in the 20th century? and in Geometry in the Colossal: the Project of Metaphysical Globalization, but it also operates – arguably – in a manner that is not primarily descriptive. The tactical way in which this text is crafted inevitably alienates the reader from trying to consent or dissent with the authors' views, and precisely because of that it lends itself so well to an exercise in learning through encoding and decoding. One needs to learn to handle the concepts proposed in it, not unlike learning how to swim, or how to find one's own way around an immediate problem like wanting to create attention for a project: there is no direct and strictly rule-based way to achieve this. Thus what we will focus on and train in the first theory module is how to acquire literacy in conception, as resulting from appropriating (through training) various acts of abstraction. These acts will feel unnatural and uncomfortable, just as gymnastics in sports does. But as in the case of the latter, training in acts of abstraction will give you a "new" body to think in. In a final exercise the students will work on presenting one or several of the architectonic concepts by dramatizing them as a stage play in short video clips to share and distribute. The results will be presented to the CAAD group at ETH. Throughout the course, the students will be expected to present in class, every second day, a next step of how they want to dramatize the concepts of their choice. Feedback and discussion in the plenum will consider how to proceed next. In this way, everyone will be able to learn intensely from the learning of the others.
Monday October 27th – Friday November 29th
Seminar meetings daily: 2-6 pm -> this as a rule; be prepared for spontaneous changes to the schedule.
Guest lectures, in the form of workshops within our usual time schedule:
Emiddio Vasquez – Nov. 6th: on Riemannian manifolds and their role in Deleuze and Guattari's philosophy (University of Lisbon, Portugal).
Dr. phil. Christina Vagt – Nov. 28th: Buckminster Fuller's Vector Cosmology (Technical University Berlin; 2013 visiting professor at Bauhaus-Universität Weimar).
READINGS
Peter Sloterdijk: What happened in the 20th century? Cultural Politics: An International Journal, Volume 3, Number 3, November 2007.
Peter Sloterdijk: Geometry in the colossal: the project of metaphysical globalization. From: Sphären II: Globen (Suhrkamp, 1999), pp. 47-72. Translated by Samuel A. Butler, Department of Philosophy, State University of New York at Stony Brook, NY 11794, USA.
Georges Bataille: The meaning of general economy. In: The Accursed Share: An Essay on General Economy, Volume I: Consumption.
Georges Bataille: Laws of a general economy. In: The Accursed Share: An Essay on General Economy, Volume I: Consumption.
MAIN READING: Gilles Deleuze and Félix Guattari: The Geology of Morals (What does the Earth think it is?). In: A Thousand Plateaus: Capitalism and Schizophrenia II. Continuum Press, London and New York 2004.
EXAMPLE OF A FINAL VIDEO BY JOEL LETKEMANN DRAMATIZING DELEUZE & GUATTARI’S “RHIZOME” AND “GEOLOGY OF MORALS” IN MILLE PLATEAUX, ENTITLED “WE”:
MAS THEORY CLASS 2014/15 // AN ARCHITECTONICS OF CRYSTALLIZATION
This is the new program for the theory modules I am teaching in the Master of Advanced Studies program “Architecture and Information” at CAAD, ETH Zürich.
“Over several centuries, from the Greeks to Kant, a revolution took place in philosophy: the subordination of time to movement was reversed, time ceases to be the measurement of normal movement, it increasingly appears for itself and creates paradoxical movements.” (Gilles Deleuze, Cinema 2: The Time-Image)
Our emphasis in this year’s theory course lies on “time”. How can architecture relate to time in such a manner that this relation informs architecture at large, i.e. not merely as the eventual and inevitable aging of built houses, the retrospective manifestation of architecture’s cultural history? Our hypothesis is the following: if time is no longer subjected to movement, we might learn to see the relative durations of stasis a built house is capable of capturing as being fabricated out of crystals of time. We will assume such crystals to be constituted by virtually active elements – quanta, minimal units of measurability. In physics, a quantum is understood as “the minimum amount of any physical entity involved in an interaction”. In a like manner, we want to explore the postulate that “minimal amounts of any architectural entity involved in an interaction” can be understood by attending to crystals of time as the new elements of a new architecture.
The cosmological revolution according to which time cannot be deduced from the movement of things, but rather constitutes something like a general container for all that ever had or will have an extension in space, was the revolution brought about by Newtonian physics in the 18th century: “We may regard the present state of the universe as the effect of its past and the cause of its future.” (Pierre Simon Laplace, Essai philosophique sur les probabilités, 1814). For the science and philosophy of modernity, time has a General Form. Today, some 200 years later, the Quantum Paradigm suggests to contemporary science and philosophy that this assumed general form of time ought to be understood as universal and yet concrete, i.e. as heterogeneous and locally singular rather than as homogeneous and globally the same. Molar processes as we know them from the solidification of matter in crystallization suggest that we think about time neither in terms of continuity nor of instantaneity; but how, then?
We will cross-read texts from Philosophy, (Astro)chemistry, Mathematics, Communications Engineering, and Cosmology in order to endow the key concepts that demarcate our field of interest with saturation and consistency. Some of these concepts are:
Elements, Molar Volumes, Molecular Bonding, Information, Fields, Frequencies, Phases, Communication Channels, Reciprocity, Ciphers, Encryption and Decryption, Crystals, Cryptography, Quadruple Structures or Double Articulations, Planes and Strata, Faces, Diffraction, Non-Organicity.
We meet every Monday for a full day, 9.30 am to 12.00 and 1.00 to 5.30 pm.
We will structure the theory day into three thematic blocks of approximately two hours each.
In addition, you are expected to work a minimum of two hours every weekday preparing readings, small presentations, etc. Assignments are communicated week by week, as we go on.
Naturally, the program is subject to possible adaptation and change.
Natural Communication (MAS Guest Seminar by Elias Zafiris, April 28-30 2015)
APRIL 28-30 2015
Tuesday April 28th 2-6 pm
Wednesday April 29th 2-6 pm
Thursday April 30th 2-6 pm
ETHZ
DARCH CAAD
BUILDING HPZ, FLOOR F
EVERYONE WELCOME TO ATTEND! Write an email to buehlmann (at) arch.ethz.ch
Natural Communication
Information-theoretic processes of communication take place via a bidirectional modelling scheme of encoding and decoding information from one domain to another. Contrary to common belief, these communicative procedures of information transfer are not direct but always follow a particular type of dynamically adjusted design, suited both to the characteristics of the involved domains and to the nature of the information to be exchanged or transferred. The basic attributes of this natural design may be summarized as follows:
I. Information flow follows a circulatory pattern between the involved domains, consisting of cycles of encoding/decoding or encrypting/decrypting information.
II. The involved domains stand in a reciprocal relation to each other with respect to the circulation pattern of information flow.
III. The information flow can be metaphorically thought of in terms of elastic cords binding the involved domains by means of a network of bidirectional connections.
IV. The reciprocal encoding and decoding processes making up the information flow are of a categorial nature and are always effected through universals of a non-spatiotemporal nature.
V. The naturality of the design, meaning its non-dependence on ad hoc choices and conventions, is characterized by concrete invariance properties of the information flow itself in relation to the involved domains.
VI. The pattern of information flow is not rigid. On the contrary, it is dynamically adjustable within the limits imposed by the invariance properties, and can be metaphorically thought of in terms of processes of crystallization of the information flow.
The attributes briefly described above, characterizing the natural design of information exchange or transformation processes of communication, can be formulated mathematically in a precise manner in the language of category theory, and in particular in terms of the notion of categorial adjunctions.
Category theory was born out of the discipline of homological algebra, which in turn traces its genesis to the merging of universal algebra with topology. The decisive moment in the history of mathematics, when for the first time the design pattern of information flow between different domains was abstracted in structural and not merely arithmetical terms, was the conception of Galois theory. The realization that the natural design of a communicative information flow follows universal rules required the more recently conceived higher abstractions of category theory for a precise formulation. Notwithstanding this fact, this universal bidirectional modelling schema of information encoding/decoding is barely known outside the arena of pure technical mathematics, and this is something that ought to be remedied urgently in the very near future. The purpose of this course is precisely the familiarization with, and conceptual understanding of, the basic notions involved in this schema.
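The adjunction pattern named above can be made concrete at its simplest level: a Galois connection between two ordered sets. The following sketch is an illustrative toy of our own choosing (the doubling and halving maps are not an example from the course): two monotone maps F and G satisfy F(x) ≤ y exactly when x ≤ G(y), and the encoding/decoding cycles they generate stabilize after a single round trip.

```python
# A Galois connection: the poset instance of a categorial adjunction.
# Two monotone maps F ("encoding", left adjoint) and G ("decoding",
# right adjoint) between the ordered integers, chosen as a toy example.

def F(x):
    return 2 * x       # encode: double

def G(y):
    return y // 2      # decode: halve by floor division

# The adjunction condition: F(x) <= y  iff  x <= G(y).
for x in range(-10, 11):
    for y in range(-10, 11):
        assert (F(x) <= y) == (x <= G(y))

# Unit and counit: each encode/decode cycle is controlled, not lossless.
for x in range(-10, 11):
    assert x <= G(F(x))      # unit: decoding after encoding loses nothing below x
for y in range(-10, 11):
    assert F(G(y)) <= y      # counit: re-encoding a decoded value never overshoots

# Invariance: a second cycle adds nothing new after the first round trip.
for y in range(-10, 11):
    assert G(F(G(y))) == G(y)
    assert F(G(F(y))) == F(y)
```

The last two assertions are the “invariance properties” of the abstract in miniature: iterating the encoding/decoding circulation is idempotent after one pass, which is exactly what the triangle identities of an adjunction state.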
Spatial Manifold, Manifold Times (MAS workshop November 24/25 2014 by Emiddio Vasquez)
Bernhard Riemann’s 1854 habilitation thesis ‘On the hypotheses which lie at the foundation of geometry’ ought to be read as it was originally intended: as a critique of measure directed at mathematicians and philosophers alike. Taking Deleuze’s appropriation of Riemann in Bergsonism as a point of departure, it is important to highlight and further develop the critical implications of Riemannian manifolds.
Considering that manifolds were not intended to be simply mathematical structures – in fact, Riemann exemplified them in the physical domain through colour and, in general, through qualities – the concept of manifolds can and should be extended to different fields. The second day of our meeting will be an attempt to explore those possibilities by looking at dynamical systems (self-organized criticality) and physical processes (1/f noise). The aim will be to conceive of time as fluctuating, but without trivializing it. Perhaps, if we were to experience it directly, we would have to rescale it to our own time metric, which points towards the infinitesimal as a direction for exploration. This could be what Curtis Roads means by ‘microsound’, which is why sound will be manipulated, stretched and granulated. We will problematize Fourier series and reflect on the idealized sine tone.
These shifts between different fields will necessarily raise questions concerning the translatability of structures (e.g. in parametric design) and their creative potential in transdisciplinary practices.
Emiddio Vasquez holds an MAS degree in Mathematical Physics and Philosophy.
Part 1
Part 2
Part 3
Here is Emiddio’s lecture from 2013: Riemannian Manifolds and their role in Deleuze and Guattari‘s philosophy
Within the domain of the sun’s inverse. Or: where are we when we think computationally?
The open seminar is co-curated by Vera Bühlmann (ETH Zurich) and Erin K Stapleton (Kingston University, London). It is a “laboratory for applied virtuality” event, organized by the Chair for Computer Aided Architectural Design at the Swiss Federal Institute of Technology Zurich, ETH. The event will take place at Wirtschaft Neumarkt and Cabaret Voltaire, Zurich, on November 26th–27th 2014. Papers will be delivered by invited speakers; each will be 30 minutes long, with a 30-minute discussion to follow.
Wednesday November 26th at Wirtschaft Neumarkt
Thursday November 27th at Cabaret Voltaire
Audience space is limited, but the event is open to anyone who is interested. Please refer to the event’s website for registration.
Download the poster here: sunsinversePosterA2-2.
Abstract
In this seminar, we focus on the role of speculation in theory and philosophy in a historical manner, yet with clear inclinations towards our contemporary present and the currently exploding interest in speculative methods in academic practice and research contexts. In order to expand our positions beyond strictly pragmatist considerations, we locate the role of speculation between two schematically accentuated poles: the first approaches the role of speculation in the (arguably inevitable) dogmatisms of anthropological cosmopolitics, operating within domains of sufficient reason; the second approaches the role of speculation within a rational cosmology, where it is (arguably inevitably) prone to engender what Kant terms “the antinomies” of pure reason, which he sees as inherent to all systematic approaches to the cosmos.
We will align these positions in a matrixical manner with vectors of (contemporary) characterizations of the sun written by thinkers such as Henri Poincaré, Georges Bataille, Jean-François Lyotard, Gilles Deleuze, and Michel Serres. Based on these characterizations we will attempt to profile different understandings of economy. Two fields will serve as ‘case-situations’ for thinking about the ‘energetic-material’ role of the sun: (1) artificial photosynthesis and its consequences for how we can think about food and energy, and (2) ‘metabolizing’ algorithms such as Google’s PageRank, which introduce circuitous units for measuring the relevancy of data in a quasi-climatic sense, and the peculiar economy of data’s value whose emergence we are presently witnessing – an economy that is being addressed within the registers of ‘biopolitics’ and/or ‘cognitive capitalism’.
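As a concrete anchor for the mention of PageRank, here is a minimal power-iteration sketch on an invented four-page link graph. The damping value 0.85 follows the standard published formulation; the graph itself and all names are illustrative assumptions, not Google’s implementation.

```python
# Minimal PageRank by power iteration on a toy link graph.
# links[i] lists the pages that page i links to (invented example).

damping = 0.85
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = len(links)

rank = [1.0 / n] * n                     # start from the uniform distribution
for _ in range(50):                      # iterate until (approximate) convergence
    new = [(1.0 - damping) / n] * n      # the "teleportation" base share
    for page, outs in links.items():
        share = damping * rank[page] / len(outs)
        for target in outs:              # each page passes its rank to its targets
            new[target] += share
    rank = new

# The ranks form a probability distribution; page 2, with the most
# inbound links, ends up ranked highest in this graph.
assert abs(sum(rank) - 1.0) < 1e-6
assert max(range(n), key=lambda i: rank[i]) == 2
```

The circulatory character the abstract alludes to is visible in the loop: each page’s relevancy is nothing but the redistributed relevancy of the pages pointing at it, stabilized by repeated circulation.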
Speakers
Terence Blake (Agent Swarm blog)
From Inversion to Many Versions. Feyerabend’s Philosophy of Nature.
Paul Feyerabend is often associated with a destructive criticism leading to an anarchism that flouts every rule and a relativism that treats all opinions as equal. This negative stereotype is based on ignorance and rumour rather than on any real engagement with his texts. Feyerabend’s work from beginning to end turns around problems of ontology and realism, culminating in the outlines of a sophisticated form of pluralist realism. This largely unknown ontological turn taken by Feyerabend in the last decade of his life was based on four strands of argument: historical considerations, cosmological criticism, complementarity, and the primacy of democracy.
Vera Bühlmann (CAAD ETHZ)
The Creative Conservativeness of Computability
Felicity Colman (Manchester School of Art)
Transmission Materialist Informatics and Regimes of Entropy
For this seminar I explore some of the core elements for a practice of creative speculation: the concepts of energy, matter, transmission, and entropy. The practice model is that of the American artist Robert Smithson. In 1971 Smithson proposed that we should compile all the different entropies. This would provide a study of ‘entropology’ (after Claude Lévi-Strauss’s description of a post-anthropology). Smithson joked about how ‘wreckage’ is more interesting than ‘structure’, and he proposed the sun and its associated entropic regimes as a methodological process, one that is productive of material systems. As any system is itself subject to change through shifts in informational matter, any computation of a system must take the transmission factor into account, and is thus subject not only to the entropy of its materiality but also to the entropic language of its sense as a description. Smithson proposes that ‘a device for unlimited speculation’ is located between the absurdity of language structure and the virtuality of the 4th dimension.
Ref: Felicity Colman, “Affective Entropy,” Angelaki 11(1), 2006. http://www.tandfonline.com/doi/abs/10.1080/09697250600798060
Ludger Hovestadt (CAAD ETHZ)
Continuing the Modernist Legacy by Reverse Engineering
Jorge Orozco (CAAD ETHZ)
How the PageRank Algorithm Operates Technically
Matteo Pasquinelli (Philosopher, Berlin)
The Computation of Cognitive Capital
Since the times of Smith, Ricardo and Marx, capital has clearly been a form of computation. The apparatuses of capital describe by themselves a complex mathematical system. After WWII, the numeric dimension of capital was coupled with the numeric dimension of cybernetics and computing machines, gradually subsuming also emerging forms of augmented intelligence. Capital, as a form of accounting, as a form of exterior mnemonic technique, is in itself a form of trans-human intelligence. Cognitive capitalism, specifically, on the basis of all its numeric procedures, from layman’s accounting to sophisticated algo-trading, from immaterial labour to scientific research, is an institution of computation. (Unfortunately, there is no video available for this lecture.)
Johannes Paul Raether (artist, Berlin)
Augmented Embodiments and Identitecture in Capitalist Society
Through the filter of his multiple Potential Identities, the Identitect Johannes Paul Raether will give an introduction to his shizzo-realist avatars and psycho-active institutions, which have been crystallizing in an evolving experimental framework they call Identitecture. Appearances of figures such as Protektorama, the SmartphoneSangoma and WorldWideWitch, as well as Transformella, ReproRevolutionary of the Ovulo-factories, will be presented and reflected upon, while instances, sites and practices within these Appearances will be discussed with respect to their terms and methods. The aim of self-made conceptualisations of proto-academic terms such as ‘Beautified Intervention’, ‘Immersive socio-real Environment’ and ‘Augmented Embodiments’ will be to continuously construct a partial and situated, yet evolving, planetary model of identity production. The lecture aims to show vectors for breaking away from the genre of political art, and to attempt to travel towards a potential framing of what psycho-realist artistic practice in a capitalist society could entail.
David Schildberger (CAAD ETHZ / ZHAW Wädenswil (Beverage Design))
The Principle of Artificial Photosynthesis Generalized
For most of us it seems to be clear: having missed the tempting opportunity to live in a lap of luxury, a paradisiacal state, we grounded ourselves and assented to having only one nature to live in; therefore we should necessarily take care of it, as its resources are finite and an escape has not yet seemed worth considering. In general, photosynthesis supplies all of the organic compounds and most of the energy necessary for life within nature. This is a fact. A function. One dimension. But new pathways might offer opportunities for further speculation. As a chemical process, photosynthesis is in principle electrical – flows of energy, affections of photons and electrons. Semiconductors offer the possibility to act as catalysts, and allow us to hack and dope existing systems – working on all kinds of atomic substrates and hence forming molecular compounds (e.g. carbohydrates). Intellectual ability as its fundamental source – opening up new worlds of food and energy flows – starters that make us eager to do more with it – from necessity to luxury. Play around. Imagine. Taste. Affections for our senses – accompanied by evoked shifts in how food is administered, and their impacts on current cosmopolitical civil engineering.
Erin K Stapleton (Kingston University London)
Digital Images, Heterogeneity, Expenditure, Destruction
Elias Zafiris (University of Athens and University of Budapest)
Natural Computation as Cryptography: A Spectrum View on Number Theory
Natural computation is based on the simulation of natural processes making use of natural materials, for example the use of synthesized macromolecules as computational units in DNA nanotechnology. I will explore the idea that the formation of these computational units requires an encoding-decoding procedure, which can be described in general as a two-way mapping from the molecular scale to the macroscopic scale. Considering these computational units as codes, we need algorithms to operate with them. In turn, this can be described as an enciphering-deciphering procedure operating by means of a key. This brings us to the domain of natural cryptography. In this context, I will discuss the role of number theory in unraveling natural keys for computation. (Unfortunately, there is no video available of this lecture.)
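To make the enciphering-deciphering-by-key idea tangible: a toy affine cipher over a 26-letter alphabet, in which the deciphering key is the number-theoretic (modular) inverse of the enciphering key. This is a generic illustration of our own, not Zafiris’s construction; the key values are arbitrary.

```python
# Toy affine cipher: encipher x -> a*x + b (mod 26) on letters A-Z.
# Deciphering needs the modular inverse of a, which exists exactly
# when gcd(a, 26) == 1 -- a small piece of number theory acting as a key.

a, b, m = 5, 8, 26            # arbitrary key pair (a, b), alphabet size m
a_inv = pow(a, -1, m)         # modular inverse: a * a_inv == 1 (mod m)

def encipher(text):
    return "".join(chr((a * (ord(c) - 65) + b) % m + 65) for c in text)

def decipher(text):
    return "".join(chr((a_inv * (ord(c) - 65 - b)) % m + 65) for c in text)

msg = "NATURALCOMPUTATION"
assert decipher(encipher(msg)) == msg   # the key makes the two-way mapping exact
```

The existence of the inverse key is the whole point: without gcd(a, m) == 1 the enciphering map loses information and no deciphering procedure can recover the message.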
************************************************************************************
Précis
The decades of our immediate past throughout the 20th century are characterized by the perhaps unprecedented air of an apparently post-cosmological era, where secular politics builds on scientific knowledge rather than on theological authority, and where questions of moral value are framed ‘critically’ within the various cosmopolitical (rather than cosmological) orders (humanist, neo-humanist as well as post-humanist) that seek to render the very touchstone of Enlightenment philosophy fit for our contemporary time: namely, to find an alternative to cosmology as the domain from which the nature of thought might be deduced. For, as Kant realized, cosmology – if pursued undogmatically, which for him meant purely rationally – is prone to produce paralogisms, and hence poses irresolvable antinomies for a modern, secular and non-tradition-based philosophy at large. In this manner, Enlightenment philosophy distances itself from theological narratives of time, involving ideas of apocalypse, the fall, and accounts of salvation.
But critical distance doesn’t mean having done with. The questions of justice, decision and judgment, in short the problem of good and evil have not, of course, disappeared. They remain entangled with the relation between the concrete power of cognition and thought’s abstract character.
Despite the general appreciation of Kant’s insight and his call for moderateness and critical distance vis-à-vis a cosmic ‘truth’, scientists from the mid-19th century on were bound once again by the idea of an apocalypse purported by insights into the cosmos itself, now arising from within the laws of thermodynamics. Considered as a scientific Universal Law (not cosmological in the pre-critical sense!), the second law seemed inevitably to announce a “heat death” of the universe (Helmholtz), which would be the “end of all physical phenomena” (Rankine). Thus, once again, there was great interest in waving Kant’s cautions against a scientific cosmology off the table, and in setting out instead to formulate one – in a sense that could now claim to rest on an approximative, empirical and experimental basis, rather than on abstract rationalist speculation alone. Henri Poincaré’s “Leçons sur les hypothèses cosmogoniques” cover, and put into a more Kantian, moderately-minded perspective, many of the then prevalent ideas about ‘entropic creation’ and ‘cosmic evolution’ that began to rise with the development of thermodynamics. Helge S. Kragh’s book Entropic Creation: Religious Contexts of Thermodynamics and Cosmology (Ashgate 2008) gives an informative overview.
In the post-Second-World-War era, there is another line of intellectuals who, in an apparently untimely manner, stick to cosmic categories in one way or another: (1) Among the most prominent of these is Georges Bataille, who, writing directly before, during and after the war, used political, anthropological and religious examples to illustrate how the operations of what he understood as the equilibrium of a “general economy of energy” are destabilized by the capitalist focus on accumulation, and by what he designated as a dangerous deferral of the expenditure of excess energy. Bataille draws attention to solar abundance as a systemic model that focuses on the expenditure of excess energy beyond utilitarian need and survival. (2) Then we have Jean-François Lyotard, for whom the question of the condition of postmodern knowledge is tied up with the foreseeable cosmic extinction of all embodied forms of life – “the sole serious question to face humanity today” (Lyotard, The Inhuman, 9). By asking “Can Thought Go On Without A Body?”, he relates the interest in liberating thought, through its aspired mechanization, from actual embodied life forms and their diversities (an interest that motivates, at least as a not-so-insignificant factor, programs of Artificial Intelligence) to the apocalyptic future foreseen by the astroscientific theories that confront humanity with the finitude of the solar ‘reservoir’. For him, the sun’s approaching death is the sole important question because he sees it as organizing the vectors followed by capitalist techno-science, which revolve around the question: how can science defy the upcoming cosmic catastrophe? Lyotard sees the apparent goal of cosmopolitical capitalism as the emancipation of intelligence from life.
Today, as Lockheed announces its breakthrough in controlling processes of nuclear fusion (which would effectively bring the production of an artificial sun on earth into proximate reach), and apparently plans to build fusion reactors (www.spiegel.de, October 17th 2014), Lyotard’s perceptive observation can be extended from Artificial Intelligence to Artificial Life and Robotic Forms of Embodiment. (3) Michel Serres is a third contemporary intellectual for whom a characterization of the sun is central to philosophy. The sun, for him, is “the real, ultimate capital” (Serres, The Parasite, 173). And it is as capital, he dares to think, that the sun can count as the origin of all things insofar as they can be reasoned in the terms of a systematical account. The sun is the realm of general equivalence, and as such it is outside the true and outside the false; “it indicates contents that are jokers” (The Parasite, 163). This introduces a turn in how to think about rationality and the nature of thought. Our challenge today is not to think the Universe’s dynamics as revolving around an axis of cosmic time, he maintains. Rather, it is to reason about a universe that expands. The nuclear-particle perspective is only one aspect of the universe’s physics. Of equal importance is the wave dimension of the photons radiated from the fusion of nuclei. Photons are parceled quanta of light – light characterized by different frequencies – from which electrons can bind with atomic nuclei and accrete into the chemical elements that make up, materially, all the bodies in a solar system. If a cosmic order, rather than an anthropological one, is to inform philosophy, it must take both aspects into account.
In consequence, where philosophy has hitherto been thinking in descriptive terms of relations of equivalence, we ought to think in contractual terms of relations of equipollence – being the same in the mediate aspects of force, power or validity (the symbolic character of Being), rather than immediately in form or essence (Being). This relation of equipollence, he maintains, differs from the relation of equivalence in that it cannot, in principle, be stated in any firm and foundational manner – it needs to be articulated contractually, symbolically, in discrete terms that formalize the reciprocal bonds of all that factors into the relations, between the real (world) and the rational (cosmos), that can be reasoned philosophically and investigated scientifically. We can think about relations of equipollence, he suggests, as a contract with nature (The Natural Contract, 1990).
With Serres’ position, the vector of computability’s inevitable alliance with the fulfillment of cosmopolitical capitalism, as exposed by Lyotard, can to a certain extent be seen as disempowered – simply because the notion of ‘signature’ interrupts the continuous transformability of functional conclusiveness. The notion of ‘signature’ appears in Serres’ book rather abruptly, and given that Serres introduces the idea of a Natural Contract as an answer to concerns about the planet’s climate and its health, speaking of ‘signatures’ is likely to raise associations with Paracelsus’s 16th-century pharmaceutics, developed out of his doctrine of a ‘harmony between the macro- and the microcosmos’. But more urgently, perhaps, we can relate it to the technologies of communication and information that constitute science today as techno-science. Here, ‘signatures’ occupy a central role in where and how we compute: the transmission of messages is organized by protocols, on the level of hardware as well as on that of software. Protocols establish encrypted keys that work with so-called ‘signatures’, which are also used to specify abstract data types algebraically. In communications technology, the notion of ‘signature’ exposes that all inferences of a computation-based logics are actually based on symbolic contracts.
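The protocol sense of ‘signature’ can be illustrated with a minimal sketch using Python’s standard hmac library. This is a generic illustration of keyed message authentication, not the design of any specific protocol: a message travels together with a tag computed from a shared key, and verification either honours or voids the symbolic contract.

```python
# Sketch of a keyed 'signature' in a communication protocol:
# sender and receiver share a secret key; every message carries a tag
# that binds it contractually to that key.

import hashlib
import hmac

key = b"shared-secret-key"   # illustrative key, established by the protocol

def sign(message: bytes) -> bytes:
    """Compute the tag ('signature') for a message under the shared key."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(message: bytes, tag: bytes) -> bool:
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(sign(message), tag)

msg = b"the medium is the message"
tag = sign(msg)
assert verify(msg, tag)                       # the contract holds
assert not verify(b"tampered message", tag)   # any alteration voids it
```

The point carried over from the text: the inference “this message is valid” is not drawn from the message’s content alone, but from the symbolic contract that the shared key establishes between the two domains.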
In the seminar we want to discuss the problems that are implicated within the question ‘Where are we when we think computationally?’. We would like to invite you to speculate with us: what would it mean to ask about the Inverse of the Sun?*
* In mathematics, an inverse problem is a problem considered as underdetermined. Or, in the language of logic: with inverse problems, it is the domain of an argument (its established reason or firm ground, as we might say metaphorically) that is treated as unsettled, while the range of an argument (its manifestation in cases, like the symptoms of an illness) can be registered, labeled and documented.
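The asymmetry described in this footnote can be sketched minimally (the squaring map is our own toy choice, not drawn from the seminar): the forward map from ground to symptom is determinate, while the inverse assignment from symptom back to ground is not.

```python
# Forward vs. inverse, as a toy: the forward map is a function,
# the inverse problem is underdetermined -- several 'grounds'
# produce one and the same 'symptom'.

def forward(x):
    return x * x          # determinate: each ground yields one symptom

symptom = 9               # the registered, documented manifestation
preimages = [x for x in range(-10, 11) if forward(x) == symptom]

assert preimages == [-3, 3]   # the ground remains unsettled: two candidates
```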
PhD Colloquy Winter 2014/15 || An Untimely Nature of Communication: The Cyphered Reality of Channels
…and: The Birth of Geometry in Encryption and Deciphering – Towards a Physics of Communication.
“Bacteria, fungus, whale, sequoia, we do not know any life of which we cannot say that it emits information, receives it, stores it and processes it. Four universal rules, so unanimous that, by them, we are tempted to define life but are unable to do so, because of the following counterexamples. Crystal, indeed, rock, sea, planet, star, galaxy: we know no inert thing of which we cannot say that it emits, receives, stores and processes information. Four universal rules, so uniform that we are tempted to define anything in the world by them, but are unable to do so because of the following counterexamples. Individuals, but also families, farms, villages, cities, nations, we do not know any human, alone or in groups, of which we cannot say that it emits, receives, stores and processes information.”
– Michel Serres (2014)
Technology presumes given measures in order to operate steadily, mechanically, reliably and without bias. In other words, technology can work smoothly when metrics are not problematical. Yet there is a reciprocal germination between our ability and our sophistication in measuring: technical devices allow for new practices, and hence are capable of opening up new understandings of matter, time and space, new modes of inhabiting the world, new knowledge, new forms of organization, new insights, new interests, altered cultural values – new manners of measuring and new devices. This relation between the (collective) subject of a lived praxis (civilization) and the (collective) subject of a functioning operator (technology) constitutes the core of media studies: it is through this relationality within collectivities (rather than individual subjects) that media are distinguished from technical instruments.
In this PhD colloquy we will read a selection of texts by the classical protagonists of the field: Eric Havelock, Harold Innis, Marshall McLuhan, Friedrich Kittler. Thereby, we will (1) sharpen a particular perspective that foregrounds the role of “technical channels” and their quasi-physical reality in techniques of algebraic symbolic encryption, and (2) pursue, question, discuss and elaborate this perspective in relation to the fact that, from today’s point of view, we are dealing not only with McLuhan’s unsettling observation about the non-neutrality of communication channels, namely that The Medium is the Message; one step more abstract now, e.g. in peer-to-peer filesharing, we have the situation that Each Message Constitutes a Channel.
Reading assignments are roughly 60 pages per week.
The colloquy will take place on Tuesday mornings from 9.30 am to 11 am at CAAD ETHZ.
Meetings are scheduled from Sept. 30th until the end of February 2015
(we will reserve flexibility to coordinate with upcoming events like the Seminarweek etc.)
First meeting on September 30th 9.30 am at CAAD ETHZ.
Please read the articles headed under „introduction“ for this meeting.
LITERATURE
(in the order to be followed in our meetings)
Introduction: Towards a Physics of Communication
Michel Serres, „Information and Thinking“ (10 pages)
Wolfgang Ernst, „Experimenting with Measuring Media“ (30 pages)
Mark Hansen, „Speculative Phenomenology of Microtemporal Operations“ (20 pages)
Vera Bühlmann, „Generic Mediality” (10 pages)
[all PDFs on the server]
Image, Form, and the Preservation of the Articulable
Eric Havelock, Preface to Plato (340 pages) [PDF on the server]
Economy and Politics with Preserved Articulations
Harold Innis, Empire and Communications (219 pages)
[http://www.gutenberg.ca/ebooks/innis-empire/innis-empire-00-h.html]
The Distinguished Origination of Artifacts in the Cultivation of Mediacy
Marshall McLuhan, Understanding Media (300 pages) [PDF on the server]
Media Archaeology in the Electromagnetic Materiality of Imagination
Friedrich Kittler, Gramophone, Film, Typewriter (260 pages) [PDF on the server]
Rooting Inside of Time: An Impersonal and A-subjective Principle of Reason
Wolfgang Ernst, Digital Memory and the Archive (260 pages) [PDF on the server]
Future Cities Laboratory FCL at NUS & ETH in Singapore
The Future Cities Laboratory (FCL) is a transdisciplinary research programme focused on sustainable urbanisation on different scales in a global perspective, laying the foundation for a new form of urban studies programme. FCL is co-initiated by the ETH departments of Architecture (DArch) and Civil Engineering (DBaug). It is the first research programme of the Singapore-ETH Centre for Global Environmental Sustainability (SEC). It is home to a community of over 100 PhD, postdoctoral and Professorial researchers working on diverse themes related to future cities and environmental sustainability.
applied virtuality book series | Die Nachricht, ein Medium – Generische Medialität, städtische Architektonik
What would it mean to reformulate, under genuinely urban auspices, the architectonic question of how the knowable finds its genesis, jointure and articulation? Mediality would have to play a central role in this. It would be the epitome of all that can, in its significance, be mediated, preserved, developed and generated. With the inversion of Marshall McLuhan’s famous dictum “the medium is the message”, Vera Bühlmann sets out to contour a generic concept of mediality capable of addressing the self-referential and self-generating versatility of medial instrumentality. In this sense, mediality is contoured as the element of distributed, discontinuous possibilities that challenge and engender one another, and that spring from a symbolic nature of the general. The author traces this notion here both through cultural history and as a diagnosis of the present.
A pixel.
Numbers and letters play upon the atomistics of the imagination.
Media are generic gestalts.
A gestalt is a connectedness.
A raster is a generic connectedness.
A point is a bindingness.
An infinitesimal point binds a continuous content.
A form is a regularity.
A symbolic form provides mediality.
Mediality is a regularity within the binding.
A contract.
Buy at amazon
INHALTSVERZEICHNIS
Précis
Introduction
Part 1: Virtuality and Mediality
1. On the Genealogy of the Medial
Media as the Archimedean Point of Our Relations to the World
Virtuality and the Question of What Is Constitutive of Media
2. Informatization as a Turning Zone in Cultural History
On the Necessity of Radicalizing the Critical Program
The Problem of Framing an Expanded Principle of Availability
Limits of a Phenomenological Ontology
3. Anew: The Question of the Referentiality of Signs
The Groundability of Information in the Element of the Symbolic
Theoretical Curiosity
Spatial Thinking – Coding an Outside by Convention
Summary
Part 2: Forms and Structures of Integrability
1. Virtuality and Constructive Form
2. On the Topos of Limitation
A Planet Named »Terra«, or The Myth of the Firmament in the Moment of »Vermeerung«
»Legere in libro naturae«, or On the Division of the World into a World of Values and a World of Facts
»Relational Ontology«, or The Modern Integration of Motion into the Manner of Determining Relations
The Relativization of Continuity as Precondition, or From Deterritorialized Thinking to Recombinant Synthesis
3. Function, Sense and Form
»Function« – History and Use as Theory and Technique
Imagination and Method, or The End of Representation through Ideation
The Question of Sense, or The Problem of the Beginning
The Idea as »Differential« of Thought, or On the Relation of Structure and Genesis in the Language Game of the Virtual
The »Informel«, or On the Concept of Similarity as Medium
Summary
Part 3: Virtualization of Dialectics: On the Relation between Theory and Synthesis
1. Synthesizing Analysis in the Paradigm of Networks
Mediality and the Dimensions of Indeterminacy of the Technical
On the History and Metaphorics of the Concept of the Trace
2. Model and Simulation: The Contingent Concrete
Simulation: Surrogate Revelation or Epistemic Tool?
Models: Mathematical Fictions?
Simulacrum, Likeness, and the Provenance of Templates in Fantastic-Anticipatable Genealogies
3. System, Element, Series: Inversion of Mimetic Lineages
The Integrity and Mode of Existence of Technical Objects
The Individuation Process of Technical Entities
»Convergence« – Limits of the Concept in the Language Game of Linguistic Structuralism
Coda: A Generic Concept of Mediality
*************************************************
EXCERPTS
Précis
Robert Musil opens his novel Der Mann ohne Eigenschaften with the heading: »A kind of introduction, from which, remarkably, nothing follows.«[1] What then follows tells, in the most striking manner, of a world in which all properties are generic: a world of general nature, in which everything distinguished is merely distinguished regularity, varied in countless ways within logistically balanced and globally uniform relations: »A barometric minimum lay over the Atlantic; it travelled eastward, toward a maximum stationed over Russia, and did not yet show any inclination to evade it northward. The isotherms and isotheres did their duty. The air temperature stood in a proper relation to the mean annual temperature, to the temperature of the coldest as of the warmest month, and to the aperiodic monthly fluctuation of temperature.«[2] And so it continues, over more than two thousand pages. The fantastic thing about Musil's work, however, is that such a depiction of things in their principled equivalence constitutes an absurd project, one that forces the narrator into the peculiar situation of not primarily stringing episodes together and nesting them within one another, but of equipping things in their generic structure with marks of regularity, so that each of them, within the novel's overall happening, becomes interestingly charged, enriched and suffused with actuality. Yet nothing that happens, happens through the triggering of changes in situations, by which one thing would lead to the next and everything distinguished would be full of implications. Instead, everything in Musil's novel happens unmotivated, or at least without compelling reason, solely through the qualification of the actuality of an event.
Of this happening we perhaps speak most aptly as of one that the narrator has consigned to the idea of a Great in the Whole, and of which he, in a matter-of-fact tone, merely reports the news.
We would like to use these thoughts as an opening image, in order to evoke a particular space of imagination for thinking about communication and mediality. We want to conceive of information as a natural element, and to assign a technical counterpart to that Musilian qualifying of actuality which finds expression in it: the coding of electric current for the transmission of signals. In such a nature of generality, so the claim, everything – Musil's »Great in the Whole« – depends on the manner in which the telling is done. And so, starting from his narrative stance of equipping things in their genericness and modulating the energetic circuits that flow through them, we want to see an alternative to the prevailing cybernetic image of the communicator as helmsman on the stormy oceans of the information flood. While the cybernetic helmsman regards all communication with a view to generality in the form of representable contents, the narrator sees in it a symbolic-material solution of general intelligibility and sensibility, which he – much like a chemist – seeks further to dissolve, to enrich, and to saturate.
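The »coding of electric current for the transmission of signals« can be given a minimal sketch (an illustration added here, not an example from the text): any message whatsoever becomes a generic, uniform stream of on/off states – and thereby comparable with any other message – once it is encoded as binary symbols. The function names below are hypothetical.

```python
# A minimal sketch (not from the text): any message becomes a generic
# stream of on/off states once it is encoded as binary signals.

def encode(message: str) -> str:
    """Render a text message as a uniform stream of 0/1 symbols (UTF-8 bits)."""
    return "".join(f"{byte:08b}" for byte in message.encode("utf-8"))

def decode(bits: str) -> str:
    """Recover the message from the binary stream."""
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

signal = encode("Nachricht")
assert decode(signal) == "Nachricht"
```

Whatever the content, the coded stream itself exhibits nothing but generic regularity – which is the sense in which information is treated here as a natural element.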
If we grasp the nature of the general in this way as a reciprocally determinable relation – between, first, the idea of a Great in the Whole and, second, the capacity to tell of general things within the horizon of this idea of the Great in the Whole – then it is clear that this nature of the general pertains to every narrator and every idea, though not in equal measure. Put differently: despite its pure genericness, this nature of the general is not equivalent to the universal. With regard to the idea of universality it must rather hold that anything generically articulated, however great the narrative capacity with which it is expressed, can never exhaustively express the universal. Within these constellations of symmetry it becomes possible to determine the character of this general as generic – that is, without recourse to a classification of nature – yet without having to ground it circularly or tautologically: we can picture the generic as islands of differently distributed and differently saturated densities of generality within the oceanic element of the universal that resonates in them.
With a technically formulated comparison we can grasp more precisely what such a nature of the general is about: a formulation that is nothing but generic is an equation. Yet we need not necessarily grasp it, in its immanence, as a relation of depiction, so long as we do not regard the general that finds expression in it as congruent with the universal for which an equation stands. If we hold open this difference between the general and the universal, we can speak instead of a relation of saturation of the generally formulated with universal content, rather than of a relation of correspondence of the generally formulated with universal form.
The interest of this book is to consider the generic character of a generality thought in this way. In order to bring this character into view in all openness, we want to render the two reciprocally determinable poles more concrete, so as to handle them better: we will regard the idea of a »Great in the Whole« as the urban, and that »capacity to tell of things in their generality« as the medial. For both we seek to formulate a generic concept, that is, a concept of the most capacious possible generality. So that this search can at the same time succeed in an open manner, we attempt, in this reciprocal determination of the two poles, to proceed from the standpoint of the principles of their respective genesis.
We thus propose to free media almost entirely from the epistemological field of tension whose cogency depends on a mutually exclusive relation between nature and culture. It is clear that with what we wish to set out, we cannot move within an element of sufficient reason and necessary conclusions. Rather, we find ourselves in a region rich and lush with reasons, one that gathers, in a manner as discrete as it is disparse, all the urban dramaturgies of every cosmological conception of order. Such a region is not only generous but also imperious, in a peculiar way whose better understanding seems to depend on an adequate concept of mediality – in Marshall McLuhan's sense, that which renders measure modulatable. Our interest in what follows circles around such a concept of mediality.
Parts one to three of the present book form the core of my doctoral dissertation, submitted in 2009 under the title Inhabiting Media. Annäherungen an Herkünfte und Topoi medialer Architektonik at the Institute for Media Studies of the University of Basel, Switzerland. I would like to thank Georg Christoph Tholen as principal supervisor and Ludger Hovestadt as second supervisor most warmly for their support, and for the critical goodwill with which they accompanied my work. For the present publication these parts were revised only slightly in matters of style. With the introduction and the coda, however, they were reframed from a retrospective vantage point, five years later.
This book is dedicated to Klaus Wassermann.
Zurich, March 2014
Introduction
»The idea of order through fluctuation is not only a new idea, it is novelty itself, its definition.«[3]
Since time immemorial, the urban has counted as an emblem of the idea of a Great in the Whole, and as a breeding ground of narrative capacity in all its forms. Technics and communication play a constitutive role in the emergence of the urban – particularly insofar as the urban, from the perspective of cultural theory and intellectual history as well as from that of natural science, is commonly held to be a singular phenomenon: its provenance can be traced back neither to biological nor to physical laws, but at most to cosmological or anthropological ones. And that always also means: to a »lawfulness« that is symbolizing and must be deciphered out of the infinity of the universe. Precisely in this singular status, a »nature« of the urban is likewise akin to a »nature« of technics, art and language – nature in the sense of a »principle of genesis«. Given the difficulties – namely, the incomparability of these phenomena – thinking about the urban (like thinking about technics, art and language) has probably always been bound up with a spiritual dimension and its symbolization: the built and instituted orders of the urban, meant to articulate a mutual social dependency as well as possible, have since time immemorial counted as a this-worldly image of an otherworldly »rightness« – in the Persian Empire, for instance, as metropolis and seat of the emperor, mother-city of the imperial colonies; as the city as site of law and dwelling place of the gods and their children in Plato; as the cosmic city, image of the universe, among the Stoics; as the City of God in Augustine – to name only a few of the variants in which the singularity of the urban has been thematized.
Now, in the increasingly secularized set-up of modern nation-states, the tradition of wanting to grasp the urban in its singularity appears to be broken. In our more recent political history, at least in the Western world, the generalization of the urban into a general urbanity – now to be understood as »unspecific« in the spiritual sense and, in this respect, as the generic of the urban – counts equally as a ground of legitimation and as a goal to be striven for in how economic and political orders are to be constituted through the purely formal institution of power and responsibility. Formulated purely objectively – that is, without privileges that would somehow have to be justified by whatever, according to tradition, distinguishes a situation as singular – but generally and literally »apparatus-like« (the constitutional state and bureaucracy, the state apparatus and its executive organs), these orders now count, precisely because of their pure externality and formality, as simultaneously »normal«, »universal« and »natural«. Let us note, then, that the urban began as a singular phenomenon through which Homo sapiens was able to emancipate itself from immediate dependence on an overpowering force of nature, all too often experienced as inhospitable. By contrast, since the founding of nation-states, the relatively young tendency toward the generalization of the urban strives in exactly the opposite direction: to grasp the urban, in its genericness, as a technically controllable »law of nature«.
In all these variations, what is at stake is the relation of the urban to the lawful. But what does this background have to do with media or, more narrowly, with an understanding of messages as media? To anticipate in all brevity: the thesis that the present work seeks to discuss and pursue claims that this generic of the urban should be complemented with a generic of the medial. This would make possible a symbolic-formal criticizability that might break open a threatening short circuit between technics and nature, possibility and necessity, without thereby infusing specific semantic determinations of content. The mark of the urban, we have said, is its singularity: insofar as its genesis seems incomparable with any other phenomenon and cannot be derived from any immediate lawfulness without that lawfulness having to be theologically charged in order to serve as an explanation. Now, this mark can just as well be attached to the genesis of the medial. If we do so, the two are – not despite but in their singularity – no longer incomparable.
But does drawing a comparison not require a third or even a fourth ingredient? Do we not need at least a positive measure, if not an elementary proportionality? The extent to which this question can be answered in the negative is what a philosophical concept of the differential is about, around which much of the following text revolves.
The idea we seek to develop consists in the following postulates: 1. There is an objectivity of messages. 2. To this objectivity belongs a being that insists in artifacts and that not only gives something to, but also wants something from, anyone who seeks to address it in its objectivity. What this objectivity wants of us, we can only puzzle over. 3. The objectivity of messages holds sway in artifacts as the being of novelty, and it can, we propose, be formulated in analogy to the objectivity of energy that holds sway in electricity. No more can a representative picture be drawn of it than is possible with regard to energy: the objectivity of messages can be weighed neither via the postulate of a fidelity to the message composed in it, nor can it be gauged via the formal integrity of its transmission. But it can be conserved, just as energy can be conserved: in sentences that express a universal correspondence.
But should the properly (media-)philosophical question not be directed at what this objectivity of messages wants? Let us take this as our starting point, and suggest at the same time that the »disclosive deciphering« of this something can count for us as the ongoing genesis of the urban.
Admittedly: the leap required to relate the singularity of the medial to the singularity of the urban, and vice versa, is a risky one. But it seems a leap worth taking, for it promises the reciprocal relation of determination as one that is open in principle – and not only in order to think a »factual infinite« in combinatorial abundance of numbers. Allegorically put, this means that modernity not only is, but is in that it learns to be itself. If this leap could succeed, it would allow both concepts, that of a generic-urban and that of a generic-medial, to be grasped, in terms of a philosophy of the differential, as indeterminate yet determinable, so as to develop against each other the respective singularity they embody. Let us set out briefly against which background considerations this leap is ventured here.
Ever since McLuhan's famous dictum that media are not to be understood merely as neutral conveyors of messages but are, rather, themselves in each case the message, mediality has been thematized in cultural studies and media theory in connection with scale. As McLuhan put it: »the message of any medium or technology is the change of scale or pace or pattern that it introduces into human affairs.«[4] Communication, he suggested, thus adds something to »human affairs«. With this properness of media that he postulates, however, they enter into a certain contradiction with the urban in its generic dynamic form, thought as control loop and apparatus. Tracing this contradiction, as well as the ways in which current positions in media theory attempt to dissolve or overcome it, has – in all critical openness – stimulated the project pursued in this work.
How, then, might this contradiction be put into a possible formulation? For this we must first back up a little. A modern order, in the sense of an »apparatus-like« order, draws its own changeability immanently from the dynamics of its own members and organs. It understands the institution of a generic urbanity (as the idea of a spiritually unspecific urban) equally as its plane of grounding and reference and, in its striving for secularizing expansion, as its proper goal of realization. Now, into such a self-feeding dynamic order nothing can, strictly speaking, be added; things can only be redistributed and brought into new constellations. To break open this closure, Serres argues: »The idea of order through fluctuation is not only a new idea, it is novelty itself, its definition.«[5] This introduction serves to explain how this seems possible. We will return to Serres' formulation repeatedly and from different directions.
Now, the language game around novelty is, as is well known, at home in eschatological discourses, where it lives precisely from the fact that the cipher »novelty« cannot be defined. Accordingly, Serres sees in the idea of an order through fluctuation not a characterization or designation of novelty in terms of content, but its proper »being« as a »cipher« become bodily. A modern order – one that is by definition only ever right now, literally always »modern«, and is never meant to persist in the same form – counts for Serres as the secularized-symbolic version of »novelty«, insofar as it literally represents nothing except itself; and this not in a referential manner, but in an »immediacy« proper only to the abstract symbolicity of the cipher, because self-generating: namely operative, in the sense of indefinite and measure-giving. In such a modern order, as the secularized form of the urban, no estate-like being may play a role any longer; all articulations must be in flux and must, in and through the fluctuations of their coagulations, adapt themselves independently to the changes that have occurred. Thus, in all consequence, the political – as something like the formal state of novelty (or better: its formal competency) – appears freed from any semantics bound up with a promise of salvation, precisely on account of the claim to be the form of this promise in generic unadulteratedness. The order of the political must answer for and authorize itself nowhere else than in the formal body of this political itself, in its capacity and ability. »Novelty« thereby becomes a public political-economic good and stands opposed to every theological misuse that would trade in individual salvation.
Put differently: only by losing their reference to an individual salvation in the beyond can »novelties« yield the promise of a general redemption in the here and now.
But wherein, now, lies the contradiction between McLuhan's concept of media and such conceptions? It arises (or does not arise) depending on how one understands the status of an order that represents nothing other than itself. We have proposed to see it, in this peculiarity, as a cipher: the cipher counts as symbolic zero point and emptiness; that is, it offers a neutral place for any determination whatsoever. But what does that mean? Does it mean, for instance, that, first, this neutrality is without any symbolic determination because it is purely formal, or that, second, it is symbolically determinable in every way because it is not (yet) formal? If we bear in mind that a form must count simply as the epitome of regularity, then this status of a cipher can be determined either as the unadulterated regular (whether approximatively-positively or, in difference-theoretical terms, via negativa) or, in principle with equal right, as the simply disparse (the absence of any regularity).
Against this background, back to McLuhan and his view that the messages kept in circulation by media consist in the formats the media embody, and are not to be sought in the contents they encrypt according to different notational systems and keep in circulation in encrypted form. Yet precisely the latter would be their legitimate role in a political order that results solely from its immanent fluctuations: in it, media would have to be transparent, neutral and purely formal. Of course McLuhan sees that media do deliver »messages« in that sense of the transport and transcription of meaning. But for the perspective of a theory of media in their generality, these, according to him, cannot be the point. Rather, one must attend to the fact that media embody their proper messages and thus insert themselves directly, as »changes of scale, pace or pattern«, into the orders in which they guarantee the circulation of immanently generated contents.[6] In the message properly contained in them, media add something to »human affairs«, namely the modulation in dealing with measure. Put pointedly: media embody scale become formattable. Thus media scholars generally credit McLuhan with having freed a science of media from the »external ascription of aspects of content or instrumentality, so as to be able to specify in the first place the historically singular caesura of technical media, thanks to which these govern the ›form of social life‹«.[7]
This ascription does not itself call into question the genesis of a modern political order postulated as autochthonous. Media are merely granted and entrusted with a governing role over the forms of life within such an order. Hence the decision regarding the status of a cipher, or of the neutrality for which it stands, is decisive. If a cipher counts as pure regularity, as pure formality without symbolizing content, then media, by embodying scale not abstract-formally but concretely and immediately, sound out something like the »dimensions of indeterminacy« in deciphering the cipher. According to the reading of the cipher, this pure regularity can be conceived both positively and negatively: as a pure regularity that is given only partially, in the negative (as in Jacques Derrida), or as one that can be progressively and increasingly deciphered and laid open in its unadulteratedness (as in Rudolf Carnap). In line with the latter, media then operate, for consciousness-forming communication, as public organs of reflection. In schematic outline this articulates roughly the picture that enlightened people have of ordinary media: the conflict with the autochthonous constitution of modern order-logic is here not systemic; it arises at most through individual self-interest (abuse). Yet McLuhan's criterion for determining media in general falls by the wayside in this understanding of media: the modulatability of measure, which media according to McLuhan embody and bring to human affairs, is here nothing proper to the media, but merely an external ascription of aspects of content or instrumentality. How do matters stand, then, with the reading of ciphers as a pure and formal regularity that is supposed to be given only as a trace, negatively?
Here a systemic conflict arises insofar as the properness of media – to embody scale – enters into a competitive relation with the properness of the objective – to be measure-giving: what counts as objective is endowed with a kind of collectively legitimated subjectivity. But precisely with this subjective character, media inevitably challenge the purely formal status of a modern order that is to find its shape only ever in the now and in which nothing is to be permanent: if the modern order is understood as the purely formal state of novelty, if it claims to be the form of the salvific promise in its generic unadulteratedness, then this implies that novelties may only appear, without any being being ascribed to them themselves: something new may exist only seemingly, if the form of novelty itself is to remain unadulterated. The entire field of tension between phenomenon and thing, fiction and reality, illusion and ideology results from this, and can in a sense be understood as the motor of modern order itself – as that which continually drives it in its »apparatus-like« form and keeps it in motion. Yet from this perspective, the question of how such a »formatting« comes about in the first place must remain bracketed.
In their role of governing the forms of life, media thus challenge either the belief in the modern self-understanding, if the conflicts are traced back to people's abusive self-interest, or else the logic of this self-understanding, in that the conditionedness of a modern order (that is, the formattability that media embody) cannot itself be interrogated. In this challenge to the logic of such a conception of order, one can see an animating motor of modernity. McLuhan's thematization of the formatting of forms of life through »scale, pace or pattern« then concerns the logic of the discourse about regularity and order (cosmos), not the logic of regularity and order themselves. But merely to give this theme a reflexive turn, in the shape of a discursive corrective that never comes to rest, cannot make the mechanisms of this corrective (the medially articulated »measures« in the shape of »formats«) themselves the object of critical consideration, because this reflexivity must avail itself of the same mechanisms in order to express itself. In his study Die heiligen Kanäle. Über die archaische Illusion der Kommunikation (2005), Erich Hörl has once again laid open, in its central significance for media theory, the theme of the symbols that are actually at work in the reflexivity and decipherability of media formats. In the autochthonous self-understanding of discursive orders he exposes a tendency, still largely unconsidered today yet broadly effective, according to which the once magically apostrophized channels of communication (of which McLuhan still speaks[8]) have by no means shed their status as sacred channels merely because this aspect of magic goes unthematized.
This would only be the case, according to Hörl, once we understand ourselves no longer as part of an »alphabetic culture« but as part of an »electro-magnetic culture«.[9] The main axes of his argument trace the symbolic constitution of the actual technics of communication media, and insist that the symbols which make electricity and information technology possible are of neither an alphabetic nor a nomenclatural, but of a mathematical and physical nature. But whereas for objective (secular) science physics counted as the transcendent reference for the mathematical, this relation has been inverted for the physics of electromagnetism: for quantum physics, the mathematical has become the transcendent reference. Against this background, through the new immediacy between mathematics and physics, the theme of spirituality once again enters the interior of an order-logic apostrophized as autochthonous, self-moving and self-generating – though now explicitly turned scientific, as intellectuality. For only that which is connected with the infinite can have the character of self-movingly generating a logic capable of framing, again and again, the order generated with this logic: the divine and the mathematical. If, however, a discursive corrective believes itself connected in a privileged way with one of the two, the aspect of presumption enters the interplay between measure-giving (the cipher) and the dynamic modulation of scale (the media). A discursive corrective understands itself as neutral and merely as an animating motor for an order that emerges purely from dynamic fluctuation immanent to that order; yet precisely in this self-understanding it falls, through the presumptuous character of the intellectuality it represents in its postulated neutrality, into a corrupting contradiction with modernity as an order that is only ever itself.
Put differently: an order through fluctuation can be controlled only insofar as the authority behind this control elevates a particular current status quo of the mathematical to a transcendent, symbolic order and declares it »neutral«. In doing so, however, such a symbolic order necessarily cannibalizes the genericness of the political order: within the being of novelty, nothing new is admitted any longer.
This brings us to the second way in which the status of a cipher, in the neutrality of the zero point and the emptiness it marks, can be understood: not as pure form (regularity) without symbolic content, but as any symbolic content without form. The operative symbols of algebra display this character: they can stand for any content whatsoever, and their form is determined only in working out the solvability of an equation (or of a system of equations). Such »form«, however, can never count as elementary and immediate in the way a geometric form does; it is always symbolically constituted. Whereas from the perspective of pure form there must be no novelty, so that the appearances of pure – literally, infinitely powerful – regularity can be given differing expression, the perspective that assumes the symbolic, in its discreteness, to lack all form holds exactly the converse: novelty does not appear but is – and it is because only the formless discreteness of the symbolic can express a »definition« of novelty that would not determine its content. This is how we wish to understand Serres' formulation: »The idea of order through fluctuation is not only a new idea, it is novelty itself, its definition.«[10] This perspective, too, holds on to the attribute »infinitely powerful«; otherwise an order that is only ever itself could be granted neither the possibility of learning nor a development of the knowable. But this attribute is ascribed not to an ideal regularity, but to the ways in which regular continuities (that is, objective forms) can be identified out of the discreteness of the symbolic.
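The claim that an operative symbol »can stand for any content whatsoever« and receives its form only »in working out the solvability of an equation« can be glossed with an elementary example (an illustration added here, not the author's; the function name is hypothetical): in a*x**2 + b*x + c = 0 the symbol x is bound to no value at all – whether it takes two forms, one, or none is decided only by working out solvability under given coefficients.

```python
import math

# Illustrative gloss: the symbol x in a*x**2 + b*x + c = 0 stands for no
# particular content; it is "determined" only by working out solvability.

def solve_quadratic(a: float, b: float, c: float) -> tuple[float, ...]:
    """Return the real solutions of a*x**2 + b*x + c = 0."""
    if a == 0:
        return (-c / b,) if b != 0 else ()
    disc = b * b - 4 * a * c
    if disc < 0:
        return ()  # under these coefficients, x receives no real form
    root = math.sqrt(disc)
    return tuple(sorted({(-b - root) / (2 * a), (-b + root) / (2 * a)}))

assert solve_quadratic(1, -3, 2) == (1.0, 2.0)  # x**2 - 3x + 2 = (x-1)(x-2)
assert solve_quadratic(1, 0, 1) == ()           # no real solution
```

The symbol itself is indifferent to every outcome; only the constellation of coefficients lends it a determinate form.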
Form wird hier zu einer Verträglichkeit in einer Art verteilter Elementarität, die wir das Disparse nennen: Jeglicher symbolisch markierbare Punkt, der in einer Form mit anderen Punkten verbunden werden, also in eine Linie gebracht und als Kontinuität herausgestellt werden soll, ließe sich im Prinzip mit jeglichem anderen symbolisch markierbaren Punkt verbinden.
Die Vorstellung, welche die vorliegende Arbeit zu erörtern und zu verfolgen sucht, begreift mit Serres die Moderne als Sein von Neuigkeit und mit McLuhan die Medien als diejenigen Elemente, welche die Lebensformen in einer Ordnung, die immer nur sich selbst ist, zu steuern vermögen. Damit drängen sich erneut die inhaltlichen und instrumentellen Aspekte der Botschaften der Medien in den Vordergrund. McLuhans Botschaften, als verkörpertes Modul zum Artikulieren von Maßgabe, so mein Vorschlag, wären besser als Nachrichten zu begreifen – als jene Nachrichten nämlich, welche in der Lage sind, das Sein der Neuheit (die moderne Gestalt des Städtischen) skandierend gleichermaßen nach zu richten wie vor zu prägen. McLuhans mediales Wirken der Botschaft – es führt zu Änderungen im Maßstab – artikuliert sich nämlich in den instrumentellen und inhaltlichen Aspekten von Medien, und zwar symbolisch: in einem Sinn, den wir vom Umgang mit Symbolen in der Mathematik kennen.
Ein Gefühl für diese Mathematizität des Symbolischen können wir aus Michel Foucaults Formulierungen gewinnen: »Wir sind«, so schreibt er, »in einem Moment, wo sich die Welt weniger als ein großes sich durch die Zeit entwickelndes Leben erfährt, sondern eher als ein Netz, das seine Punkte verknüpft und sein Gewirr durchkreuzt.«[11] Damit seien wir »in der Epoche des Simultanen«, wir seien »in der Epoche der Juxtaposition, in der Epoche des Nahen und des Fernen, des Nebeneinander, des Auseinander«.[12] Wenn wir die moderne, generisch-allgemeine Gestalt des Städtischen in ihrer mathematischen Disparsität begreifen, stellen diese Charakterisierungen weder eine Absurdität noch eine Illusion dar. Die verschiedenen technischen Infrastrukturen zur Zirkulation von Jeglichem, welche Foucault hier beschreibt, verdanken ihre manifeste Form allesamt der Mathematik, speziell der algebraisch-logistischen Verfasstheit der topologischen Struktur jener logistischen Netze. Anstatt eine Absurdität im Thema einer realen Simultanität herauszustellen, käme somit eine Absurdität in den Blick, welche die Vorstellung von einem Regelkreis selbst betrifft. Genauer formuliert bedeutet das: die Vorstellung, den Kreis – als Symbol für das Unendliche und Allumfängliche – über eine als generisch spezifizierte und ihm attestierte ideelle Regelmäßigkeit zu begreifen. Das Rationalisieren des Kreisumfangs (seine Quadrierung) hat eine ebenso lange Tradition wie das Vermessen seiner Fläche als Inhalt (seine Triangulierung), in welcher die Irrationalität des Messens offen zugestanden wird. Gleichwohl ist klar, dass die Genese der Mathematik in ihrer stets zunehmenden Mächtigkeit dank Abstraktion sich und ihre Fortschritte seit jeher dem Verfolgen genau dieser beiden Unmöglichkeiten verdankt.
Das Postulat also, in einem Regelkreis wären Rationalität wie Irrationalität, Rechnen wie Messen gleichermaßen aufgehoben und zur Ruhe gekommen, heißt nichts weniger, als dass jene beiden für das Lernen doch so unendlich produktiven Unmöglichkeiten (Quadrierung des Kreisumfangs und Triangulierung der Kreisfläche) nicht mehr weiterverfolgt werden sollen. Damit wird das Lernbarmögliche zugunsten eines Status quo auf arbiträre Weise beschränkt. So betrachtet wird verständlich, warum wir, gerade um realistisch zu bleiben, die mathematischen Symbole auf eine kritische Distanz halten sollten.
Kommen wir also auf die einleitenden Gedanken zur Geschichte der Genese des Städtischen zurück und wenden unseren Blick – trotz der enormen Herausforderung – nicht sogleich ab vom autochthonen Selbstbegründungsanspruch moderner politischer Ordnungen, sondern setzen hypothetisch und dem dargelegten Gedankengang folgend die Mathematizität als das symbolische Sein von Modernität. Folgen wir ferner Serres darin, Modernität über ihren Anspruch an Ordnung durch Fluktuation als Sein von Neuigkeit zu charakterisieren, indem die Medien die Fluktuation moderieren, einrahmen und maßregeln. Was bei McLuhan ein Wirken der Botschaft war, wird somit zu einem Wirken der Medien, welches sich im Sein von Neuigkeit entfaltet und dieses in dessen Ausdruck als Nachrichten skandierend erdenkt und formuliert. So betrachtet haben Nachrichten in generisch-selbstbezüglicher Weise am Sein von Neuigkeit teil und können als Schema und Ausdrucksform des stetigen Sich-selbst-Werdens moderner Ordnung gelten.
Die Vorstellung also, von der diese Arbeit geleitet ist, betrifft die Möglichkeit einer Experimentalwissenschaft im Abstrakten. Sie folgt damit bis zu einem gewissen Grad den Vorstellungen einer materialistisch, respektive archäologisch ausgerichteten Medientheorie. Jedoch ist die Autorin bestrebt, die in solcher Materialität oder Monumentalität gegebene Positivität nicht auf eine Natur von [diskursiver] Geschichtlichkeit zu beziehen, die es in Archivarbeit zu dokumentieren und [als Bestand] zu sichern gilt, sondern auf eine Natur von Intellektualität, die es als städtische Architektonik [preisend und immer wieder neu ermessend] zu charakterisieren gilt. Die Charakterisierung einer solchen Architektonik gäbe uns die symbolisch-algebraische Alphabetizität, in der sich das Städtische in seiner Allgemeinheit artikulieren lässt – generisch [und in diesem Sinn modern] und dennoch an der Singularität ihrer Natur festhaltend.
Im Zentrum einer solchen Architektonik stünde eine Theorie des Virtuellen, die jedoch bisher noch kaum anders denn als allgemeines Desiderat hat Gestalt annehmen können. Die Herausforderung einer solchen Theorie besteht darin, das Verhältnis zwischen den Kulturtechniken der Formalisierung und der Interpretation – oder anders gesagt: zwischen Rechnen und Schreiben – neu zu fassen. Ihr Erfolg hängt davon ab, einen Weg zu finden, die Universalität der algebraischen Symbole weder in der infiniten Mechanik des Arithmetischen aufzulösen, noch sie den Kategorien einer geometrisch oder begrifflich vermessenen Kosmologie unterzuordnen. Im Konzept des Virtuellen trifft, pointiert und in aller Kürze formuliert, das spirituelle oder theologische Problem der Unendlichkeit auf das Problem der zeichentheoretischen Referenz. Weit davon entfernt, die Perspektive einer medialen Architektonik irgendwie schlüssig, zwingend oder nur konsistent genug formulieren zu wollen, war es lediglich das Ziel der vorliegenden Arbeit, bestehende abstrakte Einstellungen zusammenzutragen, welche sich anbieten, als Herkunftslinien und Topoi einer medialen Architektonik charakterisiert und aufgegriffen sowie gegeneinander oder miteinander profiliert zu werden. Insgesamt vermögen diese gebündelten Linien, so die Hoffnung, das Risiko des eingangs erwähnten Sprungs – nämlich die Medienrealität im 21. Jahrhundert nicht primär auf epistemologische, rein historische und ästhetische oder direkt praktisch-politische Referenzebenen hin in den Blick zu nehmen, sondern sie in der Genese des Städtischen selbst zu ergründen – etwas zu entmystifizieren. Damit soll die tragende Rolle des Vorstellungsvermögens und der Wendigkeit im exakten Denken für eine städtische Architektonik plausibilisiert werden, um damit ihrer weiteren Entwicklung den nötigen Zuspruch wie auch die ebenso nötige ernsthafte Kritik zu sichern.
Nachdem damit der allgemein thematische Horizont der vorliegenden Arbeit aufgespannt ist, sollen zwei Positionen aktueller Theoriebildung herausgestellt werden, welche für dieses Buch in besonderem Maße relevant sind: Foucaults archäologische Methode zu einer analytischen Geschichtswissenschaft unter dem Primat von unsteten und immer impliziten Machtdispositiven, aus denen heraus sich Dokumente in ihrem historischen Gehalt erschließen müssen, und Deleuzes Bestreben, das Erbe der Philosophiegeschichte, welches Alfred North Whitehead provokanterweise als »Sammlung von Fußnoten zu Platon«[13] charakterisiert hatte, um einen Atomismus der Einbildungskraft zu bereichern.
Als Ausgangspunkt gilt Foucault, so formuliert er in seiner Einleitung zu Archäologie des Wissens, dass sich die Vorstellung einer linearen Abfolge von Geschehnissen heute in ein »Spiel von in die Tiefe gehenden Loshakungen«[14] aufgelöst habe. Die Ebenen der Analyse hätten sich vervielfacht, »jede hat ihre spezifischen Brüche, jede umfaßt einen nur ihr gehörigen Ausschnitt«.[15] Foucault zufolge handelt es sich bei den »Gegenständen« einer solchen Analytik, unter der Annahme eines ins Viele verteilten historischen Apriori, um »architektonische Einheiten«,[16] die nicht über eine Beschreibung objektivierbarer Einflüsse, der Traditionen, der kulturellen Kontinuitäten auszumachen seien. Historikern stünden mittlerweile neue Instrumente für ihre Analysen zur Verfügung, die ihnen Anschlüsse an das Paradigma empirischer Statistik ermöglichten: »Modelle wirtschaftlichen Wachstums, Mengenanalysen des Warenflusses, Kurven über die Zunahmen und den Rückgang der Bevölkerungsziffer, Untersuchungen des Klimas und seiner Schwankungen, Ermittlungen soziologischer Konstanten, Beschreibungen technischer Anpassungen, ihrer Verbreitung und ihrer Beständigkeit.«[17] Nach Foucaults Methodik hat man es nicht mehr mit »Dokumenten« zu tun, die etwas Geschehenes »belegen«, sondern mit »Monumenten«, die etwas Geschehenes als Geschehendes konstituieren. Die Begriffe Wachstum, Fluss, Zunahme und Schwankung verweisen direkt auf die Änderung in der Zeit. Der hauptsächliche Unterschied dieser sprachlichen Neuerung (Dokumente als Monumente zu begreifen) besteht darin, dass sich die foucaultschen Monumente immer nur aus einer diesen Denkgebäuden immanenten Perspektive beschreiben lassen. 
»Man könnte, wenn man etwas mit den Worten spielte, sagen, daß die Geschichte heutzutage zur Archäologie tendiert – zur immanenten Beschreibung von Monumenten.«[18] Eben diese nicht hintergehbare Innenperspektive der historischen Analyse ließ ihn seine Methodik am Modell des mathematisch-naturwissenschaftlichen Differenzierens und Integrierens entwickeln. Ausschlaggebend für die Adäquatheit einer Beschreibung ist somit, dem Vorgehen wissenschaftlicher Experimente analog, die Stabilität der empirisch entdeckten »internen Kohärenzen, die sich aus den postulierten Axiomen, den daraus ableitbaren deduktiven Ketten«[19], und den so erschließbaren Kompatibilitäten der »messbaren« Partikularien ergibt. Diese Partikularien gelten Foucault als »architektonische Einheiten«, und er nennt sie architektonisch, weil für ihre Analyse, wie er sagt, weder objektive Referenz noch subjektiv »das Gefühl oder die Sensibilität einer Epoche, nicht die ›Gruppen‹, ›Schulen‹, ›Generationen‹, oder ›Bewegungen‹, nicht die Gestalt des Autors im Spiel des Austausches, das sein Leben und seine ›Schöpfung‹ verknüpft hat, sondern die einem Werk, einem Buch, einem Text eigene Struktur als Einheit nimmt«.[20] Das Problem betreffe nicht mehr die Tradition und Spur, so fährt er fort, sondern den Ausschnitt und die Grenze: »Es ist nicht mehr das Problem der sich perpetuierenden Grundlage, sondern das der Transformationen, die als Fundierung und Erneuerung der Fundierungen gelten.«[21] Foucault beginnt damit, Historizität in der Form moderner Ordnung, die wir als Sein von Neuigkeit charakterisiert haben, zu erörtern. Deshalb möchten wir vorschlagen, in seinen architektonischen Einheiten das Wirken der Medien aufzuspüren, indem wir diesen Einheiten den Status von Nachrichten zuschreiben.
Das ist bei Foucault so nicht explizit angelegt, aber es scheint trotzdem seinem eigenen Verständnis recht nahezuliegen – charakterisiert er doch das Feld von Fragen, welches er durch seine archäologische Analyse architektonischer Einheiten eröffnet, in deutlich formatorientierter, medientheoretischer Sprache: »Wie soll man die verschiedenen Begriffe spezifizieren, die das Denken der Diskontinuität gestatten (Schwelle, Bruch, Einschnitt, Wechsel, Transformation)? Nach welchen Kriterien soll man die Einheiten isolieren, mit denen man es zu tun hat: Was ist eine Wissenschaft? Was ist ein Werk? Was ist eine Theorie? Was ist ein Begriff? Was ist ein Text? Wie soll man Abwechslung in die Niveaus bringen, auf die man sich stellen kann und von denen jedes seine Skansionen und seine Form der Analyse besitzt: Welches ist das angemessene Niveau der Formalisierung? Welches das der Interpretation? Welches das der strukturalen Analyse? Welches das der Kausalitätsbestimmung?«[22] Foucault selbst hat die Stimme, der in solcher Analytik zugehört wird, als die unpersönliche Stimme eines »Murmelns der Diskurse«[23] bestimmt. Wir wollen vorschlagen, in dieser Stimme nicht den kontinuierlichen Hintergrund eines anonymen Murmelns zu vernehmen, sondern das beständige und überbordende Nachrichten einer Vielzahl von singulären Stimmen, die alle diskret vernehmbar sind und welche zusammen in ihren je diskreten Weisen den Quantenstatus einer generischen Stimme ausdrücken. In einer Anonymität des Diskurses vermag sich die moderne Allgemeinheit von Ordnung nicht selbst zu autorisieren, und ihr Selbstverständnis als autochthone Verfassung bleibt bloßes Wunschdenken, eine geteilte Illusion. Nähert man sich diesem Murmeln aber mit diskretisierender Geste und erachtet es als das zusammenklingende Verlauten vieler singulärer Stimmen, so drückt dieses Verlauten den Quantenstatus aus, in dem sich die moderne Allgemeinheit von Ordnung als das Sein von Neuigkeit fortlaufend selbst autorisiert.
Diese generische Stimme drückt sich in mathematischen Ordnungen aus, in denen sich das Sein von Neuigkeit in Nachrichten vermittelt: in der Technizität der Kommunikationskanäle und in medialen Formaten ebenso wie in den Gestaltungen jeglicher Artefakte.
Damit kommen wir zur einleitenden Darstellung der zweiten theoretischen Position, welche für die vorliegende Arbeit als wegweisend gelten muss: jene von Gilles Deleuze. Dieser hat mit seinem Atomismus der Einbildungskraft auf einen philosophischen Idealismus von Problemen hingearbeitet, welcher es erlaubt, wie hier ausgeführt werden soll, Historizität als Virtualisierung von Dialektik zu begreifen. Darin sehen wir erste Hinweise, wie sich die Modalität dieses Seins, des Seins von Neuigkeit, begreifen ließe. Deleuze postuliert ein Element des Ideellen, welches er das »Problematische« oder auch das »Informelle« nennt. Um dieses Element zu denken, so schlägt er vor, gelte es, die Figur des Widerspruchs mit jener des mathematischen Differenzials zu prozeduralisieren.[24] Deleuze verabschiedet sich damit von der Vorstellung, dass sich eine Realität des Historischen unmittelbar erschließen ließe; für ihn lässt sich diese Realität in ihrer Objektivität nur auf einer Bühne des abstrakten Denkens begreifen. Deleuze bindet seinen Begriff von Realität damit an einen eminent praktischen Begriff davon, was es heißt, denkend zu sein: Es geht bei der Vorstellung von Realität nicht darum, ob sie richtig oder falsch, adäquat oder inadäquat repräsentiert sei, sondern wie reduziert oder reichhaltig an Differenziertheit sie ist. Dabei geht es nicht darum, Realität subjektiv als immer schon relativ zu begreifen, sondern als unterschiedlich gesättigt an generischer Objektivität. In der Konsequenz eines solchen praktischen Begriffs von Denken schlägt Deleuze eine Neubesetzung der Rollen in der Dramaturgie von Historizität vor: Differenz soll nicht in erster Linie das epistemologische Primat von logischer Identität brechen. Vielmehr soll sie die das Denken bannende Vorstellung einer Realität des Negativen, welche jedes Vorhaben beherrscht, mit dem Primat logischer Identität zu brechen, entmachten.
Differenz soll dem logischen Denken eine Virtualität des Reellen erschließen, die in ihrer symbolisch-mathematischen Natur als gleichermaßen unbestimmt wie bestimmbar zu gelten hat. Konkret schlägt Deleuze vor, wie später genauer ausgeführt werden wird, das nicht-A zu ersetzen durch die algebraische Formulierung des dx als symbolische Form eines Verhältnisses, welches von den Termen, in denen es sich konkret instanziiert, herausgelöst betrachtet werden kann. Das Anliegen von Deleuzes Philosophie ist deutlich verwandt mit jenem Foucaults, nämlich die Genese von Struktur verstehen zu lernen. Dafür, so legen beide nahe, muss das Verhältnis von Formalisierung und Interpretation reziprok gedacht werden, in dem das eine das andere immer wieder zu neuer Mächtigkeit erhebt. Deleuze hat als vermittelndes Element in dieser Dynamik einen Atomismus der Einbildungskraft angenommen: Das Atom gilt ihm nicht als kleinster gemeinsamer Nenner, in dessen Element sich alles universell und völlig ohne Angleichung oder sonstiges Zutun entsprechen würde. Vielmehr sei das Atom »dasjenige, was nur gedacht werden kann«.[25] Das Atom als Referenz für Ähnlichkeit lässt sich in seiner Struktur nicht bloßlegen, sondern muss im Denken selbst erzeugt werden. Darin aber ist es nicht weniger reell: Für Deleuze gibt es eine Natur des Denkens, die ebenso schöpferisch ist wie die physikalische Natur und die ebenso wie diese Gegenstand von mathematischer Analyse und Synthese sein kann. So entwickelt er seine Philosophie denn auch als einen »fantastischen Mathematismus«[26], mit welchem, wie wir nahelegen möchten, Serres’ Modell von Ordnung, als das Sein von Neuigkeit-im-Allgemeinen, eine mediale Architektonik auszurichten imstande wäre.
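Ein elementares Beispiel aus der Analysis kann veranschaulichen (es handelt sich um eine schematische Illustration unsererseits, nicht um Deleuzes eigenes Beispiel), inwiefern das Differenzialverhältnis bestimmbar bleibt, während seine Terme für sich genommen unbestimmt sind:

```latex
% Für y = x^2 gilt:
\frac{dy}{dx}
  = \lim_{\Delta x \to 0} \frac{(x+\Delta x)^2 - x^2}{\Delta x}
  = \lim_{\Delta x \to 0} \left( 2x + \Delta x \right)
  = 2x .
```

Im Grenzübergang »verschwinden« dy und dx je für sich; vollständig bestimmt ist allein ihr Verhältnis. In diesem Sinn lässt sich dx als symbolische Form eines Verhältnisses lesen, das von den Termen, in denen es instanziiert wird, ablösbar ist.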
Wissenschaftliches Verstehen, Erforschen und Gestalten in diesem Sinne würde sich nicht unmittelbar als Natur-,Technik-, Kultur- oder Geisteswissenschaft begreifen lassen, sondern müsste sich primär als Wissenschaft des Städtischen verstehen und sich aus einer Natur von Intellektualität im Allgemeinen verpflichten.
Die vorliegende Arbeit gliedert sich in drei Teile. Im ersten Teil Virtualität und Medialität erfolgt weniger eine Argumentation für eine bestimmte Perspektive als eine einführende Darstellung der größeren Problematik, innerhalb derer Medien als Medien überhaupt ein eigenständiges Thema geworden sind. Das erste Kapitel widmet sich der Genealogie des Konzeptes von Medialität, welches im medienwissenschaftlichen Diskurs eng mit demjenigen der Virtualität verknüpft ist. Im zweiten Kapitel wird der Prozess der technischen Informatisierung als kulturgeschichtliche Wendezone beschrieben, welche den cartesianischen Dualismus von ausgedehnten Dingen (Res Extensa) und Dingen, die nur denkbar sind (Res Cogitans), in eine Krise drängt. Und im dritten Kapitel wird das Problem des Verhältnisses von Information und Zeichen aufgegriffen und anhand der Frage nach der Referenzialität von Zeichen diskutiert. Insgesamt werden so unterschiedliche Verzweigungen innerhalb der aktuellen medienwissenschaftlichen Literatur skizziert und einführend erläutert.
Im zweiten Teil Formen und Strukturen von Integrabilität wird das heute apostrophierte mediale Apriori in den kulturgeschichtlich umfassenderen Zusammenhang einer ganzen Genealogie von konkurrierenden Apriori gestellt. Im Kapitel Virtualität und Konstruktionsform erfolgt der Vorschlag, die mit diesen Apriori jeweils assoziierten Denkformen vor dem Horizont eines philosophischen Verständnisses von Virtualität als so etwas wie »Konstruktionsformen« zu begreifen, welche genuin theoretische Anschaulichkeit ermöglichen und so Medienwissenschaft als eine praktische Komparatistik orientieren könnten. Im Kapitel Zum Topos der Begrenzung greifen wir die Grundannahme von Foucault auf, dass es in der Auseinandersetzung mit Historizität heute mehr als um alles andere um Ausschnitte und Grenzen gehen muss. Dieses Kapitel widmet sich mit perspektivierendem Gestus und einem analytisch-archäologischen Blick einigen Formen und Strukturen von Integrabilität und den damit assoziierten Topoi von Begrenzung. Im Kapitel Funktion, Sinn und Form wird das von Deleuze philosophisch gefasste Differential in seiner mathematikgeschichtlichen Herkunft eingeführt und in seiner Eingliederung in den philosophischen Horizont kontextualisiert. Abschließend erfolgt eine Diskussion, inwiefern der für Deleuzes Differentialphilosophie konstitutive Begriff des Virtuellen eine Orientierung verspricht hinsichtlich jener Selbstbezüglichkeit und scheinbaren Unbegründbarkeit, wie sie mit dem Aufkommen moderner Ordnungsvorstellungen und ihrer jüngeren Instanziierung in den diversen logistischen Netzwerken einhergehen.
Im dritten und letzten Teil werden die beiden Perspektiven des ersten und zweiten Teils hinsichtlich des scheinbar unauflösbaren wie auch in absoluter Weise unbegründbaren Verhältnisses von Theorie und Synthese zusammengeführt und vor dem fantastisch-spekulativen Horizont einer Virtualisierung von Dialektik erörtert. Hier erfolgt die medientheoretische Diskussion von Themen rund um das Paradigma von logistischen Netzwerken, probabilistischen Analysen und Modellierungen, kalkulierbaren Simulationen und bildgebenden Rendering-Verfahren.
Das Buch schließt mit einem Vorschlag, wie sich ein generischer Begriff von Medialität fassen ließe: als abstraktes Differential von Zeichensituationen-im-Allgemeinen – das heißt Zeichensituationen in einen unendlich reichen Zusammenhang gebracht, dessen Charakterisierungen sich experimentell analysieren und formal explizieren lassen. Hier werden die eben dargelegten Gedanken zur Genese des Städtischen und insbesondere zu deren gegenwärtigen Form als generische Urbanität (das Städtische im Allgemeinen) noch einmal aufgegriffen. Ein generischer Begriff von Medialität müsste, so soll nahegelegt werden, mit der generischen Form des Städtischen nicht länger in einem konsumtiven Widerspruch stehen. Beide könnten sich reziprok in einer Entwicklung bedingen, die singulär, unbestimmt und offen ist, ohne bedingungslos zu sein – das heißt in einer Entwicklung, die zugänglich wäre für Kritik. Wir hatten insistiert, dass auch die generische Form des Städtischen die Singularität des Städtischen nicht preisgeben sollte, und vorgeschlagen, das Reale dieser Singularität in einer Chiffre verkörpert zu sehen. Die Möglichkeit einer unbestimmten und offenen Entwicklung, die dennoch nicht bedingungslos wäre, hängt davon ab, das Reale, welches diese Chiffre verkörpert, weder unmittelbar noch unkritisch und fantasielos-schematisch zu erörtern, charakterisieren und bestimmen zu suchen. Stattdessen gilt es, das Reale (symbolisch gefasst als Chiffre) auf der Bühne eines Denkens zu thematisieren, das sich seiner notwendigen Abstraktheit (besser wäre: »Abstraktivität«) kritisch bewusst ist und weiß, dass Ideenreichtum und Realitätsgehalt sich für eine gegenseitige Entsprechung in Form bringen müssen, bevor sie etwas miteinander zu tun haben können.
[1] Robert Musil, Der Mann ohne Eigenschaften, Band 1: Berlin 1930, S. 8/9.
[2] Ebenda, S. 9.
[3] Michel Serres, »Anfänge«, in: ders. und Ilya Prigogine, Isabelle Stengers, Serge Pahaut, Anfänge, Berlin 1991, S. 18.
[4] Marshall McLuhan, Die magischen Kanäle, Düsseldorf 1992, S. 18.
[5] Michel Serres, »Anfänge«, S. 18.
[6] Marshall McLuhan, Die magischen Kanäle, S. 18.
[7] Hier sei stellvertretend genannt: Georg Christoph Tholen, »Platzverweis. Unmögliche Zwischenspiele von Mensch und Maschine«, in: Norbert Bolz, Friedrich Kittler und Georg Christoph Tholen (Hrsg.), Computer als Medium, München 1994, S. 110–135, hier S. 114; vgl. zur allgemeinen Bedeutung, die McLuhan heute zugesprochen wird, auch Derrick de Kerckhove, Martina Leeker und Kerstin Schmidt (Hrsg.), McLuhan neu lesen. Kritische Analysen zu Medien und Kultur im 21. Jahrhundert, Bielefeld 2008.
[8] Marshall McLuhan, Die magischen Kanäle.
[9] Erich Hörl, Die heiligen Kanäle. Über die archaische Illusion der Kommunikation, Zürich und Berlin 2005, hier zitiert vom Klappentext.
[10] Michel Serres, »Anfänge«, S. 18.
[11] Michel Foucault, »Andere Räume«, in: Karlheinz Barck u. a. (Hrsg.), Aisthesis. Wahrnehmung heute oder die Perspektive einer anderen Ästhetik, Leipzig 1990, S. 34–46, S. 5.
[12] Ebenda.
[13] Alfred North Whitehead, Process and Reality, New York 1992 [1929], S. 39.
[14] Michel Foucault, Archäologie des Wissens, Frankfurt am Main 1981 [1969], S. 9.
[15] Ebenda.
[16] Ebenda, S. 12.
[17] Ebenda, S. 9.
[18] Ebenda, S. 15.
[19] Ebenda, S. 12.
[20] Ebenda.
[21] Ebenda.
[22] Ebenda.
[23] Michel Foucault, Die Ordnung des Diskurses, Frankfurt am Main 1991, S. 32.
[24] »Wir stellen Nicht-A dx gegenüber, und entsprechend dem Symbol des Widerspruchs das der Differenz […] – und ebenso der Negativität die Differenz an sich selbst. Freilich sucht der Widerspruch die Idee seitens der größten Differenz, während das Differential Gefahr läuft, in den Abgrund des unendlich Kleinen zu stürzen. Das Problem ist damit aber nicht richtig gestellt: Es ist falsch, den Wert des Symbols dx mit der Existenz der Infinitesimalen zu verbinden; aber es ist ebenso falsch, im Namen ihrer Ablehnung jenem Symbol jeglichen ontologischen oder gnoseologischen Wert zu verweigern.« Gilles Deleuze, Differenz und Wiederholung, München 1992 [1968], S. 220.
[25] Ebenda.
[26] Ebenda.
Module 7 MAS 2013/4 // Architecture and I
Information Societies and the Question of Subjectivity and Identity
MEDIACY, COMMUNICATION AND MODERNITY
We meet once a week on Monday afternoons, in order to encode and decipher together some of the »intellectual magnitudes« that are formalized in the key vectors which constitute the kaleidoscopic theory-landscape of postmodern, poststructuralist, and media theoretic discourses. The meetings will be organized according to particular reading assignments. We will have 2 hours of lecture and 2 hours of discussion in each session. You are expected to prepare yourselves by reading the assigned texts in a »data-literate manner«. What does that mean? I also don’t know. On a methodological level, the Seminar is an exercise in finding out more about this. Some preliminary guidelines: don’t (primarily) try to grasp the depth of the arguments discussed. Exercise in how to encounter a text in a formal manner first. This requires fast reading. Look at each text as an »intellectual body« that allows you to sense certain things, and to do certain things – characterize in your own words what these bodies allow you to engage with, and how. Read the titles as what they literally are: Formal registers that collect and express particular capacities. Contemplate the table of contents. Again and again. Imagine what the chapters and paragraphs might be about. Scroll and read through the chapters and paragraphs while keeping their titles (as well as the position of the titles within the tables of contents) in mind. Try to develop a sense for how value-attributing weights are distributed in the core concepts of the texts. Use computational tools to get organized in your circular and contemplative readings: make maps of the frequencies of used words, of the activities (verbs) attributed to the themes (nouns), of the qualifications attributed to the activities and the nouns. Pay attention to how passivity (receptivity) and activity (agency) are organized.
Prepare to share these readings in the discussions (organize keywords, attribute labels to them, present to the group your characterization of the »intellectual bodies« that you began to get familiar with in your reading).
The readings are provided in electronic form on the MAS server.
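As one concrete way into the »computational tools« mentioned above, a few lines of Python suffice to produce a first frequency map of a text. This is our own minimal sketch (the tokenization is deliberately naive and the sample sentence is only an illustration); take it as a starting point for the mapping exercises, not as a prescribed tool:

```python
import re
from collections import Counter

def word_frequencies(text, top_n=5):
    """Map the most frequent words of a text (naive lowercase tokenization)."""
    # split on anything that is not a letter; umlauts kept for German texts
    words = re.findall(r"[a-zäöüß]+", text.lower())
    return Counter(words).most_common(top_n)

sample = ("The medium is the message. The message of any medium "
          "is the change of scale that it introduces.")
print(word_frequencies(sample))  # the most frequent token here is "the"
```

From such raw counts one can move on to the verb/noun and qualification mappings described above, for instance by adding a part-of-speech tagger in front of the counting step.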
*************************************************************
Session 0 // Mo July 7th 2014
Within the quantum paradigm
READINGS
Constituting Objectivity
M. Bitbol et al. (eds.), Springer 2009
(Introduction, ca 20 pages)
What is the Measure of Nothingness? Infinity, Virtuality, Justice
Karen Barad, Hatje Cantz 2012
(ca 15 pages)
*************************************************************
Session 1 // Mo July 14th 2014
Orality, literacy, and the discordant practices in »media-based communication«
READINGS
The Gutenberg Galaxy. The Making of Typographic Man
Marshall McLuhan (1965)
(250 pages)
The Literacy Episteme: From Innis to Derrida
Jens Brockmeier and David R. Olson, in: The Cambridge Handbook of Literacy (ed. by David R. Olson and Nancy Torrance), Cambridge UP 2009.
(20 pages)
*************************************************************
Session 2 // Mo July 21st 2014
The dawn of anthropocentrism and the settlement of age and nonage (Unmündigkeit) into legal terms
READINGS
What is Enlightenment? (1784)
Immanuel Kant
(ca 10 p.)
Transcendental Doctrine of Elements (1781)
Immanuel Kant
(ca 50 p.)
The Architectonic of Pure Reason (1781)
Immanuel Kant
(ca 10 p.)
System of Cosmological Ideas (1781)
Immanuel Kant
(ca 10 p.)
*************************************************************
Session 3 // Mo August 4th 2014
The transcendental subject and anthropology, or some lemmata of linking a »dead corpus« (legal terms) with a »quick organon« (natural faculties of reasoning)
READINGS
How to make our ideas clear
Charles Sanders Peirce (1878)
(ca. 20 p.)
Chapter “Conclusion” from Difference and Repetition (1968)
Gilles Deleuze
(excerpts, ca 20 p. to be announced)
*************************************************************
Session 4 // Mo August 18th 2014
Post Modernity
READINGS
Mathematics and the Roots of Postmodern Thought
Vladimir Tasic, Oxford UP 2001
(150 p.)
The Postmodern Condition. A Report on Knowledge (1979)
Jean-Francois Lyotard
(110 p.)
*************************************************************
Session 5 // Mo August 25th 2014
The materiality of discourse and the mercantilization of knowledge
READINGS
We have never been modern
Bruno Latour
(150 p.)
*************************************************************
Session 6 // Mo September 1st 2014
»Bodies-to-think-in« as »communicational subjectivities«
READINGS
EigenArchitecture. Cultivating the generic. A mathematically inspired pathway for architects.
Ludger Hovestadt (in Ludger Hovestadt and Vera Bühlmann (eds.), EigenArchitecture, ambra 2013)
Articulating a thing entirely in its own terms, or what can we understand by the notion of »engendering«?
Vera Bühlmann (in Ludger Hovestadt and Vera Bühlmann (eds.), EigenArchitecture, ambra 2013).
*************************************************************
Session 7 // Mo September 8th 2014
Universal Rights: Cosmo-Literacy, and the legal conditions for a home of subjects that are coming-of-age (mündige Subjekte)
READINGS
The Natural Contract (1990)
Michel Serres
(150 p.)
Metalithicum Colloquy #5 Coding as literacy: Self Organizing Maps
Recent advancements in computer science, namely in data-driven modeling techniques, have opened up a new level of design culture. In this conference we would like to consider and discuss how to theorize and apply highly abstract computational modeling procedures in architectural design.
As architects, computer scientists and philosophers we think that the state-of-the-art procedures, as they are implemented today in CAAD tools such as scripting libraries and parametric design environments, are undoubtedly of enormous power. Yet we think that they have potentials and limits which need to be discussed independent of purely pragmatic questions like those related to computation power, the availability of programming skills among architects and engineers, or a sudden plenty of available data to work with. How to deal with computational procedures touches upon many of the issues at stake in the old disputes around architecture as art vs. architecture as science, but we don’t find this question adequately addressed by either camp. So we would like to consider the implications involved with an open mind by relating this question to the philosophical approaches to computability and calculate-ability more abstractly, and with that, to the relation between mathematics and concepts. More precisely, we would like to ask how the genuinely synthetic character of the computational procedures may be theorized not in terms of a formalist and representational paradigm on the one hand, nor in a strictly case-based analytical paradigm on the other, but in terms of what we might preliminarily call ‘computation literacy’.
With this aim, we would like to raise one particular procedure to exemplary status as a point of reference for our discussions: the Self-Organizing Map (SOM), a procedure introduced some 30 years ago by Teuvo Kohonen. It is a generic and promising procedure which, as we think, is somewhat restrictively and inadequately viewed as merely a kind of neural network. Based on our theoretical approach, and on some practical experiments, our hypothesis is that it has particular capacities in relation to data-driven modeling that remain largely unexplored. If we try to understand SOM’s abstractness in a way similar to how we understand that of Markov chains, SOM promises to be of comparable importance (in terms of general applicability). Our core interest is to discuss how we could grasp this level of abstractness, and with what benefits and at what costs.
Following these practical and theoretical interests, we would like to discuss on three levels:
- Section one: Self Organizing Map (SOM) as a generic computational modeling tool.
- Section two: Discussions on the theoretical and philosophical status of abstract/universal algebra in general, and on the approach of viewing the operational level of computation as literacy in particular.
- Section three: Possible applications in CAAD and urban design using SOM or similar tools for data-driven computational modeling methods.
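The colloquy’s point of reference, the Self-Organizing Map, can be made concrete in a few lines of code. The following is a minimal illustrative sketch of Kohonen’s classical algorithm (grid size, decay schedules, and all function names are our own choices, not part of the colloquy program): a rectangular grid of weight vectors is repeatedly pulled toward the data, each sample attracting its best-matching unit and, with decaying strength, that unit’s grid neighbors; this decaying neighborhood coupling is what produces the map’s topology-preserving character.

```python
import numpy as np

def train_som(data, grid_h=8, grid_w=8, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Train a 2-D Self-Organizing Map on `data` (n_samples x n_features)."""
    rng = np.random.default_rng(seed)
    n_features = data.shape[1]
    # Codebook: one weight vector per grid node, randomly initialized.
    weights = rng.random((grid_h, grid_w, n_features))
    # Grid coordinates, precomputed for the neighborhood function.
    ys, xs = np.mgrid[0:grid_h, 0:grid_w]
    for t in range(epochs):
        lr = lr0 * np.exp(-t / epochs)        # learning rate decays over time
        sigma = sigma0 * np.exp(-t / epochs)  # neighborhood radius shrinks
        for x in rng.permutation(data):
            # Best-matching unit (BMU): the node closest to the sample.
            d = np.linalg.norm(weights - x, axis=2)
            by, bx = np.unravel_index(np.argmin(d), d.shape)
            # Gaussian neighborhood centered on the BMU.
            h = np.exp(-((ys - by) ** 2 + (xs - bx) ** 2) / (2 * sigma ** 2))
            # Pull every node toward the sample, weighted by neighborhood.
            weights += lr * h[..., None] * (x - weights)
    return weights

def bmu(weights, x):
    """Map a sample to its best-matching grid node."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)
```

Trained this way, nearby grid nodes come to represent similar regions of the data space, which is why SOM lends itself to data-driven modeling and visualization rather than functioning merely as a classifier.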
Program May 22nd – 24th, 2014
Thursday 22nd of May 2014
13:00-13:30 Welcome
Dr. phil. Vera Bühlmann and PhD Candidate Vahid Moosavi
13:30-14:30 INTRODUCTION – CODING AND ARCHITECTURE
Prof. Dr. Ludger Hovestadt
Chair for Computer Aided Architectural Design, CAAD, ITA, ETH Zurich, www.caad.arch.ethz.ch
14:30-15:30 Discussion
15:30-16:00 Coffee
16:00-17:00 WARM UP ONE – PROFILING KEY CONCEPTS IN CATEGORY THEORY
Prof. Michael Epperson and Dr. Elias Zafiris
Center for Philosophy and the Natural Sciences, College of Natural Sciences and Mathematics, California State University, Sacramento, USA, http://www.csus.edu/cpns/epperson/ and Department of Mathematics at the University of Athens, http://users.uoa.gr/~ezafiris/
17:00-17:30 Discussion
17:30-18:30 WARM UP TWO – PROFILING KEY CONCEPTS IN CONTINUOUS GEOMETRY
Prof. Sha Xin Wei
Director, School of Arts, Media and Engineering, Herberger Institute for Design and the Arts, Arizona State University; Founding Director, Topological Media Lab, Concordia University, Montreal. http://ame.asu.edu/directory/selectone.php?ID=97
18:30-19:00 Discussion
20:00 Dinner
Friday 23rd of May 2014
Section 1: SOM Technology
09:00-10:00 SELF-ORGANIZING MAP AS A MEANS FOR GAINING PERSPECTIVES
Prof. Dr. Timo Honkela
Department of Information and Computer Science, Aalto University, http://users.ics.aalto.fi/tho/
10:00-11:00 Discussion
11:00-11:30 Coffee
11:30-12:30 SELF-ORGANIZING MAPS AND LEARNING VECTOR QUANTIZATION FOR NON-STANDARD DATA
Prof. Barbara Hammer
CITEC centre of excellence, Bielefeld University, D-33594 Bielefeld, Germany, http://www.techfak.uni-bielefeld.de/~bhammer/
12:30-13:30 Discussion
13:30-14:30 Lunch
Section 2: Philosophy of Modeling and Computing
14:30-15:30 THE PRACTICAL PROBLEM OF CALIBRATING TOPOLOGICAL DYNAMICS AGAINST SOCIO-CULTURAL & HISTORICAL PROCESSES
Prof. Dr. Sha Xin Wei
Director, School of Arts, Media and Engineering, Herberger Institute for Design and the Arts, Arizona State University; Founding Director, Topological Media Lab, Concordia University, Montreal
http://ame.asu.edu/directory/selectone.php?ID=97
15:30-16:30 Discussion
16:30-17:00 Coffee
17:00-18:00 SHEAVES OF BOOLEAN LOGICAL FRAMES: A LOCAL-TO-GLOBAL APPROACH TO QUANTUM GEOMETRY AND LOGIC
Dr. Elias Zafiris
Department of Mathematics at the University of Athens
http://users.uoa.gr/~ezafiris/
18:00-19:00 Discussion
20:00 Dinner
Saturday 24th of May 2014
Section 3: Possible Applications of SOM and Similar Methods in CAAD
9:00-10:00 TRANSFIGURING PHYSICAL AND ABSTRACT SPACES: A HIGH-DIMENSIONAL APPROACH
Dr. André Skupin
Department of Geography San Diego State University,
http://geography.sdsu.edu/People/Pages/skupin/
10:00-11:00 Discussion
11:00-11:30 Coffee
11:30-12:30 DATA DRIVEN MODELING BEYOND IDEALIZATION
Vahid Moosavi
PhD Candidate at the Chair for Computer Aided Architectural Design, CAAD, ITA, ETH Zurich, www.caad.arch.ethz.ch, Researcher at Future Cities Laboratory, Singapore-ETH Centre, www.futurecities.ethz.ch
12:30-13:30 Discussion
13:30-14:30 Lunch
Section 2: Philosophy of Modeling and Computing (continued)
14:30-15:30 THE ONTOLOGY AND EPISTEMOLOGY OF INTERNAL RELATIONS: BRIDGING THE PHYSICAL AND CONCEPTUAL IN QUANTUM MECHANICS AND QUANTUM INFORMATION
Prof. Dr. Michael Epperson
Center for Philosophy and the Natural Sciences, College of Natural Sciences and Mathematics, California State University, Sacramento, USA, http://www.csus.edu/cpns/epperson/
15:30-16:30 Discussion
16:30-17:00 Coffee
17:00-18:00 THE BODY OF THAT CIPHER WHICH RENDERS PRESENT »THE WORLD«, OR A CIRCLE THAT CONTINUES TO COMPREHEND ITSELF. LOUIS HJELMSLEV’S IDEA OF »AN ALGEBRA IMMANENT TO LANGUAGE« – SUITABLE FOR CHARACTERIZING DISCRETIZED PROBABILITY DENSITY AS A KIND OF ALPHABETICITY?
Dr. phil. Vera Bühlmann
laboratory for applied virtuality, Chair for Computer Aided Architectural Design, CAAD, ITA, ETH Zurich,
https://appliedvirtualitylab.wordpress.com
18:00-19:00 Discussion
Apéro riche and farewell
Stiftungsbibliothek Werner Oechslin // Werner Oechslin Library Foundation in Einsiedeln, Switzerland
Since 2010, the lab has organized the Metalithicum Colloquies in collaboration with the library.
From the website (www.bibliothek-oechslin.org)
The beginnings of the Werner Oechslin Library go back to a time when the study of primary sources and confirmation of findings through consultation of the originals was rarely considered important and, indeed, held in high esteem only in special fields of research. Today, attitudes towards research based on primary sources have changed radically. Uncertainty and reorientations in the humanities demand a (re-)examination of their foundations. This has led to re-evaluations and even rediscoveries. The interest in primary sources and the insight into the necessity of their study – especially regarding forms of thought, scholarly models, and attempts at integrative understanding and comprehension – is now bigger than ever, and still growing.
Thanks to years of patient preparation, the establishment of the Werner Oechslin Library has created an instrument that accommodates the needs of primary source research, responding to the interest in scholarship based on original texts. Large parts of the library have been located in Einsiedeln since 1980. This stock was considerably increased after 1985 when Werner Oechslin returned from years working in Italy, the USA, and Germany; since then the library has been systematically augmented. Encouragement from many quarters prompted the decision to turn the private library into a public institution, making it available to a larger audience of scholars. The architect Mario Botta designed a project for a new library building as early as 1996; this was then constructed in several stages, surmounting numerous difficulties, over the course of the following years. On 9 June 2006 this unique library was inaugurated by Pascal Couchepin, then-member of the Swiss federal council, along with 160 other guests.
Werner Oechslin on order in libraries, forms of knowing, cosmology and architectonics (his lecture from our 4th metalithicum colloquy)
The Quickness of Matter, Doped in Its Polyalphabetic Textuality or The Articulation of Articles, beyond Prescript and Postscript
A short text that articulates our interest in “writing” as a key topos of research:
(published in Manuel Kretzer, Ludger Hovestadt, Alive: Advancements in Adaptive Architecture, ambra 2014 (forthcoming))
*********************************************
“As an example of human achievement,” John Orton maintains in his book Semiconductors and the Information Revolution: Magic Crystals That Made IT Happen, the semiconductor ought to “rank alongside the Beethoven Symphonies, Concorde, Impressionism, medieval cathedrals and Burgundy wines and we should be equally proud of it” (Orton 2009, p. 2). Why is it, indeed, that this demand feels odd? Of course this lack of appreciation for our current form of technics is owed partially to its abstractness and the degree of expertise it seems to demand from us. But has this not been the case for any of the abovementioned artifacts, all of which we by now hold as precious and dear? An understanding of how semiconductor electronics works, what it is conditioned by, and to which ends we might be able to cultivate it, holds a promise of no lesser enjoyment: “I only hope that my attempt to explain something of its appeal will help the layman to obtain the same kind of enjoyment from an understanding of semiconductor electronics that he or she might experience in contemplation of any of these [achievements]” (Orton 2009, p. 2).
Alive: Advancements in Adaptive Architecture demonstrates in exemplary manner how architecture currently sets out to explore its own quick and vibrant reality—a reality that is saturated by electronic currents and metabolizes a proper, immanent kind of actual and virtual activity, an activity proper to built environments that can be coded to behave, in principle (if not, at least for the time being, altogether in practice) ad libitum. Especially in this context, I will argue, Orton’s question deserves our unbounded attention. In addition to issues of abstractness and expertise, there seems to be a more profound obstacle in guiding our ambitions, as laypeople, toward learning to appreciate our most recent expressions of art and technics. There is something inherently uncanny implied in what Orton demands, which I would tentatively characterize as a waking up to a kind of neo-Babylonian confusion: as architects, scientists, economists, engineers, designers, we are learning to speak our common “language,” the language of mathematics, information, and code, in many different tongues.
Raising the topos of the Tower of Babel, and the confusion we are allegorically said to have inherited from it, would be no news in itself if the situation merely concerned the many manners in which people speak about the things and the realities in which they live, or if architects were speaking about the structures and buildings they erect. In language, we are today ready to generously grant that sense can and shall be made in manifold and arbitrary manner. This generosity can be granted, we feel, because the mathematical and formal descriptions of things chemical, physical, or biological are capable of unambiguous representation—if not yet entirely pure and perfect, then at least with an ever greater degree of approximation. It is in regard to this, I would like to suggest, that we seem to be caught up in a neo-Babylonian kind of situation: matter, like language formerly, can meanwhile be articulated in manifold manners, and none of these articulations can be regarded as strictly equivalent with all the others. In short, while the former Babylonian confusion meant that we have many names for the same thing, the confusion of our times inverts the situation: we now have many things for the same name. Matter that is informed can be assumed to exist in pure and original form as little or as much as this can be assumed of language.
But still, why this concern with language and text, when our declared interest is in semiconductor electronics? And why even in a literal (non-allegorical) manner—and this at a time when everyone is fascinated by the activity and agency proper to objects that are networked, objects that individuate within particular environments? Our interest is in learning to obtain a reflected and critically distanced enjoyment from understanding these “things in their quickness”—an enjoyment like the one we have learned to obtain from appreciating other cultural artifacts aesthetically. We thereby take aesthetically to mean: without being subjected to the spells a cathedral (for example) casts on its visitors as long as it is “alive” in all its not-explicitly mediated symbolic corporeality. However, since the eighteenth century, aesthetics has been much preoccupied with registers of form—formal (ir)regularity and expressions of aesthetic content in various gestalts and styles. With regard to our interest in these quickened pieces of matter, the emphasis comes to lie elsewhere: aesthetics, applied to the metabolism of things through their electronic circuits, must relate to the contractual discreteness of symbolized quantity much more than to the regular continuity of form. It is with this emphasis on discreteness and distributedness that the “originality” of these electronically quick things can be said to be more straightforwardly “textual.” For what makes any of them possible is that the symbolic notations of algebra are the “textual” substrate of our computational procedures: without algebra, no domestication of electricity; without electricity, no synthetic chemistry; without Boolean algebra, no coding of electrical current. In short, without algebra, no programming that comes anywhere close to the sophistication we have grown capable of. Without programming, no informing of materials in their chemical and subatomic consistency.
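The chain “without Boolean algebra, no coding of electrical current” can be illustrated quite literally: the arithmetic performed by semiconductor circuits reduces to compositions of Boolean operations. The following is a minimal, purely illustrative sketch (in Python rather than in a hardware description language; the function names are our own):

```python
def half_adder(a: int, b: int) -> tuple[int, int]:
    """Two binary digits in, a sum bit and a carry bit out."""
    s = a ^ b  # XOR: sum without carry
    c = a & b  # AND: carry
    return s, c

def full_adder(a: int, b: int, carry_in: int) -> tuple[int, int]:
    """Compose two half-adders and an OR to propagate the carry."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2

def add_bits(x_bits: list[int], y_bits: list[int]) -> list[int]:
    """Ripple-carry addition over little-endian bit lists."""
    carry, out = 0, []
    for a, b in zip(x_bits, y_bits):
        s, carry = full_adder(a, b, carry)
        out.append(s)
    out.append(carry)
    return out
```

Nothing in these functions is arithmetic “in itself”: addition emerges entirely from the algebraic forms XOR, AND, and OR, which is the sense in which Boolean algebra is the textual substrate of coded electrical current.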
It is within algebraic formulation that the mathematical quantity of “information” complements and specifies the physical quantities of generic “matter” and “energy” (let us recall Norbert Wiener’s famous dictum that “information is information, not matter or energy” [1948, p. 55]). It is within algebraic formulations that light can be energized and turned into specified matter in physics that operates on a quantum level (Feynman 1985). To put it drastically: the manner in which we formulate all things, today, is algebraic (formulaic and equational) before we can think of it as formal (functional, a specification of direction within a formula). And for this reason, algebraic text is very different from aligning words into sentences and developing paragraphs to build upon each other to manifest an argument. Algebraic text essentially means to develop an equation: to spell out a space of reciprocal transformability between two sides that we want to consider as equivalent. Algebraic text is like the constitution that makes it possible to formulate reasonable sentences in discourse. In this, we can see the structure of our new Tower of Babel, where one and the same word relates to many things (as opposed to many words referring to the same thing).
In its literal meaning, algebra signifies the reunion of broken parts. Thus we might hold against the point this text tries to make: are we not then living in an era in which the legendary Babylonian confusion can finally be fixed and undone, rather than waking up to a neo-Babylonian confusion?
Indeed, we must not look far to find all sorts of spiritualistic phantasms that are nourished by and prosper from the fact that we are “communicating” today “mathematically.” In such communications, it appears, we can devote ourselves to our intellectual appetites without worrying about hubris and the illegitimate acclamation of divine power, because those intellectual appetites are the appetites of reason-in-general. In the beginning was not the word, we can read in many atheist stances today, but information; not an evocation on the basis of subjectivity, but an objective quantity. Some people go as far as claiming that the physical laws of conservation ought to be subjected to the laws of information, which are claimed to be more comprehensively natural.
But in all this enthusiasm, at work is a blind spot to which Jacques Derrida has, to a certain degree, attempted to draw our consideration. General reason reasons about life in general, he claims. And life-in-general cannot possibly be alive. The problem at stake is the very nature of the units with which such reasoning proceeds; that is, the nature of information. Information, as a unit for mathematical communication, cannot have a positive identity—it is what it is precisely because its nature is sheer determinability without essential content. Thus, what does it mean to live intellectually, he asks, in the scene of archi-writing rather than in the legacy of an original text? With information, communicating—literally fitting-together what is essentially disparate—spreads through and characterizes all things on the level of their energetic makeup.
Text that is produced, in Derrida’s scene of archi-writing, is no longer text that is meant to passively preserve, for times to come, a present moment that is already and forever past: rather, it is completely unbiased and open for realizing any kind of sense that may be made. This is because, he maintains, in the beginning we do not find self-evident presence, but “original prints”: we must learn to think, somewhat paradoxically, that everything begins with “re-production” (Derrida 1972, pp. 84ff.). Rather than capturing something that is not meant to remain—a sound, a word—writing ought to be liberated from its obligation of merely representing speech, attend mostly to reproducing from the stock of what has already been written, and concentrate on self-preserving as much of it as possible through repetition. Repetition proceeds in a circuitous path, and in proceeding like this, it alone is capable of instituting a postponement, a deferral, which—and this is the nucleus of Derrida’s argument—is capable of giving the space a thing takes, according to whatever gestalt it might adopt throughout each of the numerous acts of intellection in which its differential identity is being considered and appreciated. What we do in writing, Derrida argues, is not articulating a thing’s identity by voicing it, but inscribing a thing’s locus in a time and a space that is only “there” and “actual” in remembering. Hence, we are literally in-scripting the possibility for a thing to remain present—intellectually. This is what he calls spacing.
In a strikingly straightforward manner, this view seems to find its positive concretization in the technical substrates on which our real infrastructures run today: printed electronics as a truly generic materiality that might be inscribed (coded) to perform in any manner imaginable. In an almost literal sense, printed electronics presents us with a textual kind of materiality in which each produced piece spaces out a possibility.
But Derrida’s spacing—and this is crucial for his post-structuralist thinking—is symbolic in a non-physical, non-corporeal, non-positive sense. The spacing of course inscribes itself into a kind of “materiality”—yet it is not that of a sound or a phoneme. Derrida imagines the alphabet decoupled from its relation to the vivid bodies of actual sounds, and instead raised into an infinite and combinatorial mode. In other words, the alphabet is turned into a form that generalizes all the spellings and articulations possible within it: the alphabet is considered as the alphabetical. To keep speed—movement—and hence allow for (combinatorially) new inscriptions and (combinatorially) new things, we ought to treat all things actually physically manifest as dead. In order to keep intellectual originality alive and quick, we must defend its liveliness by building stocks of the original memory-force. Derrida’s argument is a complicated one, and it would be far too ambitious to attempt to discuss it here with any appropriate amount of care. But what I would like to take from it is its elevation of the phonetic alphabet into a more abstract and symbolical level; however, if these considerations are meant to help find a way of appreciating algebra as language, things as technical as its articulations, and its articulations (e.g., semiconductor electronics) as ranking alongside Beethoven’s symphonies, then we have to depart from Derrida’s position at the point where he considers this more abstract level in reified and apparatus-like form, as the alphabetical. Instead, we can consider it as the template of a plurality of alphabets of coding, and like this we can connect his line of reasoning with our interest in the generic materiality of semiconductors.
In electronics, I would like to suggest, it is Derrida’s alphabetical that is multiplied and raised into an infinitary mode. Derrida’s interest is to mark out that the letters of an alphabet apparatus are not properly characters, but ciphers that depend upon deciphering. Yet the limit of his point of view is that he relates this deciphering to the preservation of the alphabetical order that articulates units of sound, not units of energetic electrical current—his writing seeks to trace sense in its absence, as grammé. Yet electrical current circulates in symbolic inscriptions, and exchanges quanta of potentiality. Here, the means of expression is not the letters of the phonetic alphabet, but an open number of alphabets of coding.
Thus, how might we learn to make sense of such a notion of algebraic text? How might we learn to make sense of it in a manner that can be captured neither in terms of prescripts (formulas as laws) nor in terms of postscripts (tracing a text’s sense in the absence of its originality)? If we are to learn to appreciate aesthetically and critically the impressive and fascinating quickness of matter today, we ought to shift registers from representation to saturation in how we think about text, form, and quantity.
REFERENCES
Derrida, Jacques (1972) “Freud and the Scene of Writing.” Yale French Studies, “French Freud: Structural Studies in Psychoanalysis,” no. 48, pp. 74–117.
Feynman, Richard (1985) QED: The Strange Theory of Light and Matter. Princeton, NJ: Princeton University Press.
Orton, John (2009) Semiconductors and the Information Revolution: Magic Crystals That Made IT Happen. Academic Press/Elsevier.
Wiener, Norbert (1948) Cybernetics, or Control and Communication in the Animal and the Machine. Cambridge, MA: MIT Press.
Theory as pockets or folds of knowledge, or The atom is that which must be engendered by thought
The applied virtuality theory lab is situated at the Chair for Computer Aided Architectural Design at the Institute for Technology in Architecture, Swiss Federal Institute of Technology ETH in Zurich.
Digital architectonics, pre-specific modeling
How do we see the stakes attached to these terms? We see them as promising to help find an abstraction from the meanwhile classical distinction in cultural studies and anthropology between the Renaissance disegno tradition of modeling in how we think about technics, and the contemporary interest, from a media-archeological angle, in thinking about technics as cultural techniques. But why seek an abstraction from this distinction at all? While the former cannot seem to rid itself of an understanding of the artist as placeholder of divine spirit on earth, the latter seeks in reaction to this, and rather aggressively, to annihilate the strikingly real and even ordinary experience that there is something like richness in esprit, wit, capacity in intellection (what in German is best captured by the adjective “geistreich”). We want to maintain that the materialist point of view of the cultural-techniques approach ought to be opened up by a contemporary atomist approach, one which conceives of the atom not as the basic common denominator that constitutes all things in their material terms of extension in space and time, as action (matter and energy), but as the basic common denominator that factors in, symbolically (information), in the extensional and active constitution of all things. In short: the atom as that which can only be thought. With this interest, we follow with great curiosity the discourses in media and cultural studies, science-and-technology studies, as well as in anthropology; but our interest remains attached to a certain esteem for the idiosyncratic engagements which we see at the heart of all learning and acquisition of mastership.*
*(mastership, from German: das Können; alternative translations like skill, know-how, or dexterity all swallow up and annihilate the abstraction and intellection that crucially factor in whatever we can call “das Können”. Hence, for the time being, we choose to put up with this difficult and – if simplified – somewhat misleading term “mastership”.)
Statement of interest
Research and teaching at the laboratory for applied virtuality draws from the cultural wealth, both historically and contemporarily, of forms of knowing and learning. We view these treasures in a sportive way and want to learn what kinds of strength and abilities they allow us to develop and acquire when integrating their diverse schemas, and when learning to understand better the peculiar stories each of them is telling – very often by fighting over right definitions of commonly used concepts and notions. Our perspective onto them is more akin to the one we take when hiking in the mountains, cultivating good cooking, or when working out in a gym: the performed movements (mentally translate to: movements in thinking) are not for the sake of the value of the precise movements themselves, but for the overall fitness, agility and pleasure which exercising will bring you, individually.
By pursuing a comparatistic approach in learning to understand structurally the symbolic forms of thought, and their varying codification and systematic integration into different theories, we seek to engage with theory as pockets or folds of knowledge, and with the schemata and theorems that make up a theory’s consistency as a kind of vehicle that allows us to move considerately and actively in our thinking within networked and information-saturated environments. (This understanding of theory has been characterized beautifully by Michel Serres in, among other texts, “Was Thales am Fuss der Pyramiden gesehen hat” (Hermes II: Interferenz, trans. Michael Bischoff (Berlin: Merve Verlag, 1992 [1972]), 212–39).)
Information technology is playing an increasingly important role in architectural design and thinking. While the design and conception space of computational methods opens up a vast new horizon of what can possibly be done, beyond anything feasible before, we can observe a strange phenomenon among our students and within fellow communities: the greater and more professional the skills in working with computers, the lesser the curiosity for individual processes of abstraction, formalization, problematization, and solution-seeking. In other words: the better and more assiduous the craft in scripting and in parametrically twisting and tweaking the schemes and templates offered off-the-shelf, the lesser seem to be the individual ambitions for autonomous thought and reasoning. To our great surprise and incomprehension, we experience strong indicators of a devaluation of the cultural esteem for intellectuality. As if we were living in the end times of history. And this despite the obvious fact that all technologies supporting the common wealth today are nothing less and nothing more than the products of intellectuality.
THE FOLLOWING INTELLECTUAL TRADITIONS AND LEGACIES ARE OF CENTRAL INTEREST
* the mathesis tradition, which understands mathematics as the art of learning
* the characteristica universalis tradition, which keeps the generative interplay between the symbolic elements (stoicheia) of forms, letters, and numbers open.
* the universal algebra tradition, which regards formulas as laws of conservation, and hence addresses everything that can be the object of thoughtful consideration in terms of saturation, differentiation, factorization rather than representation.
* the emerging tradition of contemporary mathematics in category theory, sheaf theory, topos theory, algebraic geometry, in its power of re-situating inherited notions of articulation and structuring in the philosophical, architectonic, sense.
New Materialisms | Swiss participation in the COST Action Networking European Scholarship on "How Matter Comes to Matter"
The applied virtuality theory lab is happy to coordinate the Swiss participation in COST Action IS1307 New Materialism: Networking European Scholarship on "How Matter Comes to Matter".
Overview
“New materialism is a relatively young epistemology, which has been developed in the humanities in order to reach out to the natural sciences, or bridge the two-culture divide. Its rationale pertains to the necessity, in the 21st century, to do interdisciplinary work (ecological crises, financial crisis, technologization of everyday life) and/or to find a label which may bring existing interdisciplinary work together for fruitful dialogue. The research questions that frame the action are:
- How to account for the metaphysical assumptions underlying our academic frameworks in order to better understand the processes of our knowledge production? How to create spaces in which the specific rules of different academic and non-academic practices of knowledge production can be made visible and thus negotiable?
- How to generate epistemologies that are equipped to do justice to the crises European (and non-European) societies face in the 21st century? What would a framework look like that is rigorous enough to address economy and ecology, politics and technology, and the everyday?
- How can practices of knowing be re-conceptualized and refigured in order to avoid a retrograde assignment of concepts to ‘material things’? (How) should these practices be institutionalized in European universities and among academics, policy makers and other stakeholders in Europe (and beyond)?”
Main objectives of the Action
The COST Action New Materialism: Networking European Scholarship on "How Matter Comes to Matter" aims to develop and interlink an emerging new materialist scholarship that uses matter and materiality as a focal point and searchlight in researching matters of ecology, science, technology, medicine, politics, and the arts.
More info on the COST website: http://www.cost.eu/domains_actions/isch/Actions/IS1307
And the Memorandum of Understanding on Iris van der Tuin’s Academia.net site.
applied virtuality book series | Sheaves, when things are whatever can be the case
This is a book that holds the intellectual wealth of our world to be elemental. Today, the classical architectonic elements of form, quantity, units, numbers, principles, foundations are all constituted by information, and by literacy. Artefacts are things whose nature consists in our intellectuality. Each one of them is engendered in its kind according to how we bundle our desires. For breadth, for scope, for legitimation, for ease, for comfort, for challenge, for joy, for fear, for understanding, for communication, for sharing, for giving, for taking, for caring. Sheaves will not describe anything. It will not judge. It will bundle marks, left by acts of inception, the imprints of which we find in artifacts. There are no continuous texts in the book, but a wide range of topics are indexically concentrated so as to sheave the abounding substance of things-that-are-whatever-can-be-the-case, according to probabilities and their distributions. How to read this book? By taking its notions seriously. Search the internet, and they will lose their generality. They will begin to speak to you, vividly. Bundle such riches with the riches of other notions, and they will activate each other. Take its pictures seriously, as well. Photograph or scan them. Use them as indexes while searching the internet. Again, you will find rich stories. Bundle those riches, concentrate them into new characterizations of identities that are interesting to you. Let yourself be inspired by the intellectual wealth of our world. You can expand it. It is an exciting adventure, demanding and optimistic.
Ludger Hovestadt is an architect and computer scientist, and, since 2000, Professor for computer-aided architectural design at the ETH Zurich. He works at the borderline of calculability, and coined the terms "narrative infrastructures" and "serious storytelling" to open up the manifold possibilities of information technology to architecture. He has founded several companies. His current research emphasizes how, through a proper understanding of the nature of electricity and its relation to infrastructures, architectural thinking might be prompted to shift from energy crisis to energy culture, with its truly optimistic outlook.
Vera Bühlmann holds an MA in English Literature and Language Studies, and a PhD in media philosophy. She is founder and head of the laboratory for applied virtuality at CAAD ETH Zurich. Applied virtuality expresses the orientation of the theory lab towards how technics, artifice and literacy can constitute architectonic thought in a manner capable of cultivating the power of digital code, electricity, and information technology. Her interest is in thinking towards such architectonics in terms of a critical rationalism, mediated by an alphabetization of algebraic thingness, a “characteristica res generica”.
applied virtuality book series | EigenArchitecture, Computability as Literacy
Medializing the generic. A path out of the technological and economic excesses in contemporary architecture. A book on research and education in architecture and information technology, conceived of as a philosophical interplay between two species similar in kind: neither of them is in the least disciplinary, both affect everything, and both are arts of structuring. The one 2,500 years old and dignified, the other just fifty years of age and impatient.
This book shifts the frame of reference for today's network- and structure-oriented discussions from the applied computational tools of the twentieth century back to the abstractness of nineteenth-century mathematics. It rereads George Boole, Richard Dedekind, Hermann Grassmann, and Bernhard Riemann in a surprising manner. EigenArchitecture argues for a literacy of the digital, displacing the role of geometrical craftsmanship.
The book comprises a programmatic text on the role of technology in architecture, a philosophical text on the generic and on algebraic articulation, and seven exemplary projects by post-graduate students in 2012 at the Chair for Computer Aided Architectural Design at ETH Zurich, Switzerland.
applied virtuality book series | domesticating symbols (metalithicum II)
Technology is not simply technology; it changes character over time. We suggest there is a twin story to it. We call it metalithicum and postulate that it has always accompanied that of technology. It concerns the symbolics of the forms and schemes humans apply for domesticating their environments. We assume that the protagonists of this twin story, the symbolics of those forms and schemes, are also not simply what they are but change character over time. From this perspective, Domesticating Symbols looks at the substrate on which today's data-processing machines operate: information-technological media and apparatus no longer operate primarily on the substrate of physical forces and their mechanical principles. Rather, their effectiveness deploys on a quasi-immaterial bed made of probabilistic signal horizons of symbolic codings, through which the erstwhile physical substrate is now formally rendered, as "data" in the sense of "informational constellation". In this regard, it is important to stress that information technology today is no longer simply confined to elaborately controlling and investigating processes that may already be accessed through mechanical apparatus. Indeed, a movement is underway towards learning how to grant the energetic constitution of our world a right, and a form of address, of its own, amidst its mechanical constraints.
This book is the second volume based on the metalithicum colloquies organized once a year, where distinguished personalities from a broad range of architecture-related fields come together to discuss particular technological developments that are economically significant and philosophically interesting. The conferences are organized by the applied virtuality theory-lab at the Chair for Computer Aided Architectural Design, Swiss Federal Institute of Technology ETH Zürich.
EXCERPT FROM THE BOOK: THE INTRODUCTION
Pursuant to the first colloquy, about "printed physics", the second one takes off from the rather clinical observation that the substrate on which today's data-processing machines operate has changed, not only quantitatively but also qualitatively, since we have learned how to deal technically with energy in electrical form; information-technological media and apparatus no longer operate primarily on the substrate of physical forces and their mechanical principles. Rather, their effectiveness deploys on a quasi-immaterial bed made of probabilistic signal horizons of symbolic codings, through which the erstwhile physical substrate is now formally rendered, as "data" in the sense of "informational constellation". In this regard, it is important to stress that information technology today is no longer simply confined to elaborately controlling and investigating processes that may already be accessed through mechanical apparatus. Indeed, a movement is underway towards learning how to grant the energetic constitution of our world a right, and a form of address, of its own, amidst its mechanical constraints. This right and this form of address ought to take into account that, for the first time, photovoltaics succeeds in harvesting energy in electrical form straight from sunlight, and, to boot, entirely without recourse to any of the ever-dwindling tangible energy resources that our planet (still) holds in store.
Let us therefore abandon closed inter-process relationships of physical forces and power as a basic reference frame for the symbolic substrate of today's information technology; let us do so at least hypothetically, and for the length of this colloquy. Information is neither matter nor energy, as Norbert Wiener put it more than half a century ago, although it is very likely that he could not yet foresee anything beyond the transformation of energy-consuming appliances in terms of information technology. No later than with the latest such transformation, however, that of harnessing infrastructures to IT-based energy production and distribution, new, application-oriented, pragmatic questions arise about what is calculable.
POPULATION DYNAMICS. FROM QUANTITIES TO QUALITIES.
Whereas the "symbols" handled by information technology are at first taken in their formal-mathematical significance, i.e. no differently from the functional-equation systems of mechanical technics, the question of an adequate referential framework arises with increasing intensity: how to assess the capabilities and proficiencies that are acquirable in the potential-related spaces opening up pursuant to these information-technological transformations. Symbolic coding does indeed allow the behaviour of mechanical relations of forces to be rendered and controlled. But at the same time, other relationships of effects, especially genuinely social interrelations, can be symbolically represented and medially organized. Complete business models, of firms such as SAP or IBM, are based on this. Yet a strictly formal-mathematical acceptation of said "symbols" suggests an approach to information technology that treats it as a continuation of mechanical technics, now however made analytically accessible, and therefore more powerful by orders of magnitude in relation to the possibilities, seemingly optimizable and automatizable, that are to be operationalized through the algebraic symbols. From this application-oriented angle one might, perhaps a tad provocatively, declare the computer, taken as a universal Turing machine, to be nothing more than a geometrical machine, precisely because of the analytical notation standards, and at once in spite of them.
The Metalithicum colloquies start from the premise that the behaviour of processes media-ized by a form of technics whose processing unit is information may not adequately be described through the physical-mechanical referential framework that is traditional to our understanding of technics. With the fantastic-programmatic term "metalithicum" we reach for an abstraction of it. The neologism seeks to apprehend the aforesaid qualitative change in our relationship with stones and their geo-index: meta- standing for abstraction, and -lithicum pointing to the various eras of the so-called Stone Age, in particular the Neolithic, which denotes the Neolithic revolution and the incipient settled way of life: the domestication of nature. For beneath the incipient operational availability of our gradually acquired techniques of symbolizing and operationalizing the relations whose interplay makes up our world, the substrate of our existence changes along with the technical substratum. This substrate reveals itself today as a sort of "symbolized physics", to which we refer for the moment as media-ized nature, more for want of a better term than out of conviction.
The first, printed-physics colloquy examined the technical functional principles of current information technology indirectly and projectively, by studying the conditions of their production-related (re-)producibility and the resulting economic availability. For illustrative purposes, we suggested a similarity between the "print revolution" currently taking place in the production of information technology and the situation surrounding the introduction of letterpress printing at the beginning of the modern age. Just as that development contributed, in a manner probably unimaginable to Gutenberg's contemporaries, to popularizing, and thereby secularizing, the hitherto largely monastically administered knowledge of the time, we suspect, in view of the presently ongoing mutations, the advent of similarly momentous consequences. In this context, it is important to remember that the current "print revolution" is, as already emphasized in early reflections, e.g. by Marshall McLuhan, first of all concerned with the availability of documents, i.e. of representing products such as books, periodicals, newsletters, or e-mails, as well as radio and TV programmes. To be sure, on that level too, drastic changes are underway. Much more fundamental, however, appears the said development where it concerns the production of the appliances themselves that operate on the physical level: from processors, storage media, aerials, amplifiers, and LEDs in screens and lights, up to photovoltaics: all of them functioning machines that have for quite some time now been produced in the shape of printed foils, on an industrial scale.
At least since the modern era, energy processes have passed as universally describable with regard to their principles, but as locally situated in their occurrences, and in this very respect as geo-metrically grounded. The mechanical principles of dealing with fire, for example, or wind, water, or stone, therefore seem "natural" to us, even where their processes are susceptible of being set off by mechanical apparatuses, engines, or even seemingly self-driven automata, and in this sense of being caused; nevertheless, they remain largely reliant upon the channelling and conducting of physical forces by means of cultural techniques, and in this respect localized and grounded too, from the early forms of farming up to the most modern infrastructures of today's power supply.
As we manage, however, to secure our power through electricity that we harvest straight from the currents of the sun, and make it available by means of information-technological networks, we convert these principles of localization and grounding of mechanical-geometrical cultural techniques into a principle of deterritorialization, such as is characteristic of digitalization in general. While we are familiar with modified accessibility and availability of digitized documents from the Internet, similar accessibility and availability might develop for the digitizing of apparatus, and indeed for power production.
The second Metalithicum colloquy, this one treating of "domesticating symbols", addresses the question of how we deal with the near-ubiquitous pervasion of our everyday life by these patches of machine-made, but intellectually symbolized fabrics of effectiveness. These first, merely quantitative dispositions will soon lead to the appearance of other qualities, too. What then, if these developments reach a certain level of saturation across populations, so that they cause new structures to take shape, to the point where we might, dimly, make out the conditio humana? We shall attempt to reflect upon the specific potential-related spaces that are presently accessible to us, along with the modified conditions of our speaking of, and dealing with, them.
CULTIVATION AND DOMESTICATION STRATEGIES WITHIN THE SYMBOLIC
Today we are both capable of and in the process of generating textures of effectiveness, in the form of printed foils whose physical-energetic behaviour we are able to code symbolically. We shall take this circumstance as a model miniature for generating a thought draft of how to comprehend the generic-urban infrastructures from a distance, in their articulable symbolicalness, instead of understanding them as a kind of scientific-secular nature.
The decisive turn in all this seems to us to lie in a strange self-referentiality that we experience between the technically formalizable and operationalizable on the one hand, and a physically understood naturalness on the other, such as has been taken, at least since the modern age, as the referential plane for algebraic-functionally perceived symbols. It is in such physically understood naturalness (the Cartesian res extensa) that we are accustomed to rooting the relations facilitated by algebraic symbols, and in which we are accustomed to placing our apparatus, machines, engines, etc. Yet the locus proper to the Cartesian res cogitans, and especially the interrelation between the two, has not ceased to puzzle the thinking of philosophers.
Kant still paraphrased a peculiar effectiveness not immediately derivable from that physical referential plane as Geistertreiben, a ghosts' race. While he would not argue away the phenomenon of an effectiveness that is not primarily physically motivated and, in this sense, immaterial, even if it may unfold its effectivity in material terms, he kept it outside the scope of his concept of reason. In this way, positivity is shifted to centre stage of strictly scientific inquiry and development; but its limits clearly come to light today, as this positivity is being externalized and logistically distributed in the by now dominant electrical-power and information technologies.
The rational-grid-based projection planes applied in functionally representing relations of effectiveness, which Kant used as the basis for his reflections about the workings of nature, still provide today the framework for all manner of fantastic projections. We call such projecting "fantastic" because this projectional space has turned into a topological space on behalf of associative networks. Upon the technical substrate of "information", and the global netting that results from it, mechanical causality, as an explanatory principle for interrelations between territorial spaces, dissolves, within the logistical set-up of the territories, into probabilistics. Such probabilistics is not only no longer referable to any universal measure or metric; it does not unfold on any foundation at all. It is at work outside of any geometrically delineable space. Along with the actuality of probabilistic relations goes a new primacy of operable discreteness over the proportionally conceived continuity of analogue recording and registering. It is important to emphasize this aspect because the primacy of the discrete and articulable over the passively recordable also introduces a new power of expression into the formulation of problems, in both senses, mathematical and linguistic, of formulation. For not every possible solution is equivalent, in its capacity to accommodate complexity, to other solutions that are equally possible. It seems we might best approach this situation by seeing in it the beginnings of a new "alphabetization". Information science has long been speaking of "alphabets of coding", yet this seems to be meant largely metaphorically. Still, are we not observing a new kind of sophistry, related to simulations and the techniques of imaging (bildgebende Verfahren) applied in computer-aided diagnostics and forecasting, as well as to techniques of data processing in general?
If we take this observation seriously, and consider the birth of philosophy in Ancient Greece out of the tension between alphabetization and sophism, we may hypothesize interesting symmetries with our own times. Exploring these symmetries, fantastically and projectively, will certainly not hand us any prognosis of what to expect. But it might help to develop a literacy in coding that is more adequate to the new alphabetization we seem to be experiencing.
THE CONTENT OF THIS BOOK
All the texts in this book centre upon the role of a coding and decoding practice that is capable of putting up such fantastic projection spaces. This practice is necessarily at work in all synthetic construction as well as in analytic deduction and integration: all notions of atoms, elements, characters, letters, numbers, etc., are necessarily symbolic in the form in which we deal with them. Otherwise we could have no systematic and precise concepts and ideas of them. It is therefore not by accident that the reflections about domesticating the symbolic converge upon the most fundamental cultural areas: religion, politics, and scientific secularization; space, geometry, and law; spirituality, communication, and learning. The first article, by Georg Christoph Tholen, "Displacement, the Impossible Interlude between Man and Machine", opens by discussing why, in an era of digitalization, the question of the locus of technics and media must be re-situated. He makes clear why anthropological and instrumental categories, of technics as a means and instrument of already existent purposes (telos), are unable to grasp the scope of the symbolic in its specificity, i.e. in its oscillating between form-giving and form-withdrawing. Tholen's text takes stock of the current media-scientific discourse in its most prominent positions, and proposes a media metaphorology in order to incorporate the unique performance of the digital into the horizon of critical-analytical philosophy.
The second article, "A Fantastic Genealogy of the Articulable", by Ludger Hovestadt, argues for understanding programming languages as a genuine and novel paradigm for language comprehension. He suggests going beyond considering programming languages as merely encoded representations of formal (logical) languages, and reformulating, on the strength of these languages' performance, the question of the articulable: of what is sayable at all. Guided by considerations of how this might be brought about, Hovestadt presents a fantastic genealogy of that which is articulable digitally, today. This will be followed, in a subsequent volume, by a schema by which we may learn how to comprehend the computable in the light of the articulable, and the articulable in the light of the computable.
The third contribution, Gregg Lambert's "Two Images of Global Violence", is about the linking of technical terms with theology and religion, and addresses a form of violence that is, in his words, unparalleled in documented history and of global dimensions. In the course of technically induced globalization, a governing reality has arisen that we do not know how to deal with: global violence can be comprehended neither as strictly human, nor as divine or natural. Our thinking therefore needs specifying, in accordance with economic, technical, political, cultural, and ideological mechanisms.
Vera Bühlmann's "Arché, Arcanum, Articulation, the Universal and its Characteristics" moves in the same field, and discusses the stakes attached to the philosophical notion of the "universal". Her text suggests comprehending the mathematical as a language in which the universal is characterizable for the very reason that nobody is a native speaker of it, i.e. at home in it as we are in our mother tongues. No one can learn to speak it without intellectual effort. To that end, it is important, as Bühlmann maintains, to situate the cultural technique of writing on a stage of abstract thought which can be set in different manners, in accordance with a particular technique's algebraic-symbolic constitution. In pursuit of this movement, the finiteness of what is intellectually graspable grows richer in saturation as it is confronted with the inexhaustibility of that which could, virtually, be rendered lucid. The universal is no longer taken as a referential frame for that which is articulable, transmissible, and communicable in writing, but, in its own right, as the pre-specific genitality and fertility proper to the thinkable, within the horizon of which a strict separation of natural and artificial things is no longer sustainable. Thus, Platonic dialectics is virtualized and thereby put in a position to approach a geo-philosophical idea of wisdom, in which technics no longer is the transparently set condition of the possibility of thinking, but might become thematic in its own right.
The fifth contribution, "Media Code, Dialogues on Digital Society", by Christian Doelker, brings in some friendly distance, and the ground for some good-humoured self-irony. He advocates eluding, today more than ever, the ongoing demands for direct engagement, and the urgent prompts to which we are subjected by our continually being wired. As a means, he proposes recalling the wise men and women, and the situations, that were, in the course of cultural history, already beset by questions similar to those that today are posturing as incomparably unique. His text comes in the form of the script of a fictitious radio-broadcast series, in which Ortega y Gasset, the philosopher, invites eminent figures of our intellectual history to a discussion about form, message, content, and code, and an alertness of mind that is anything but a matter of course. His guests are Marilyn Monroe, Berthe Morisot, Charles Darwin, John Amos Komenský, a.k.a. Comenius, and Plato.
The last article, "Mechanical Justice", by Alexander Niggli and Luis Muskens, finally reflects upon the relationship between law and logic. "Are dealings with justice mechanizable?", the authors ask. They hold that this question urgently needs to be thematized today, because avoiding engagement with the problematic implications of a possible mechanization of jurisdiction, and with its inclusion within a critical framework, inexorably opens the door to a quasi-standardization of verdicts. It is, therefore, a matter of grasping the connection between logic and decisions, and of examining ways of tenable delegation to appropriate infrastructures, exactly in order to avoid a creeping, implicit de-facto mechanization. The authors consider an analogy between the Turing test, concerning computers' faculty of imitating the process of thinking, and the mechanization of justice to be false; the real question is how to find algorithmic ways by which computers may mimic jurisdiction in a sufficiently complex manner, and be trained to decide certain cases in an acceptable fashion.
applied virtuality book series | printed physics (metalithicum I)
Technology is not simply technology; it changes character over time. We suggest there is a twin story to it. We call it metalithicum and postulate that it has always accompanied that of technology. It concerns the symbolics of the forms and schemes humans apply for accommodating themselves within their environment. We assume that the protagonists of this twin story, the symbolics of those forms and schemes, are also not simply what they are but change character over time. From this perspective, Printed Physics looks at the technological and economic developments with which the physical characteristics of materials can be symbolically programmed, for example in the field of photovoltaics or in electronics more generally. These devices are being industrially produced by printing procedures. It is no exaggeration to call this a veritable printing revolution, although, unlike in Gutenberg's day, the printed products represent, primarily, their own functionality. They demonstrate what they can do in a technological, operative way.
This book is the first based on the metalithicum conferences organized twice a year, where distinguished personalities from a broad range of architecture-related fields come together to discuss particular technological developments that are economically significant and philosophically interesting. The conferences are organized by the Laboratory of Applied Virtuality, at the Chair for Computer Aided Architectural Design, Swiss Federal Institute of Technology ETH Zürich.
EXCERPT FROM THE BOOK: THE INTRODUCTION
The topic Printed Physics takes as its starting point the phenomena observed in recent developments in information technology, by which materials can have their physical characteristics formally analysed, technologically constructed, and (bio-)chemically synthesized on a symbolic level, and, hence the wording of the title, can be produced industrially, using printing technologies. This manipulation of materials, specifically upgrading them so they become capable of information-technological programming and control functions, is called "doping". Doped materials can be manufactured using a process that bears striking similarities to the printing technologies we are familiar with from the past. The manufacture of digital processors and memory chips, for example, is in fact reminiscent of lithography, copper etching, and the chemical printing of photographs, and thus continues a line of earlier forms of analogue relief printing methods. In the case of printable solar cells, it can be said that instead of ink on paper, ions are literally being "imprinted" on silicon. Yet there is one important difference, which becomes apparent in the respective notions of "imprinting" and "doping". Unlike any other print product we have known before, this new printed matter plays a genuinely operational role, rather than a primarily descriptive or representational one. What we call printed physics actually refers to tiny electronic devices, produced and distributed on an industrial scale in processes that are akin to those used in the printing and distribution of newspapers.
From a philosophical perspective, something interesting happens in this printing of doped materials. The rationally defined grid, which has been crucial to us for deducing all our physical descriptions of nature, serves here as a frame for projecting the fantastical. Could it not be, is the question we would like to pose with this book, that we are witnessing a development in the field of physical thinking similar to the one that occurred with the logification of geometry in the early 19th century? Are we seeing the appearance of a non-naturally determined physics as a complement, as it were, to non-Euclidean, projective, algebraic geometry? As a context for our discussions at the Printed Physics conference, we suggested a thought experiment: suppose a new enlightenment of physicalist and naturalized rationality and logic were to be announced, brought about and carried by the qualitative and quantitative impacts of the doping of materials and their production through print. How could this be argued?
A THOUGHT EXPERIMENT
For the purpose of this exercise, let us regard the pre-modern monasteries as proper "production plants" for the copying of the holy scriptures. The subsequent secularization of this process was brought about and carried by the qualitative and quantitative impact of printing, exemplified by the availability of "text" as a medium, the standardization of formats, and the freedom, for wide sections of society, to access what was once a ritual and sacred description of the world and to take an experimental approach to it. Astonishingly enough, in a reverse analogy, today's factories, businesses, and bureaucracies, with their modern industrialization of secularized rationalism, appear like "monasteries" for the copying of physical "regularities". Systemically integrated into orders of higher and lesser function, and cybernetically implemented in multifarious information-technological infrastructures, these law-like regularities act as impulse generators for societal, scientific, and economic processes. Our thought experiment suggests testing what happens when we put "functional infrastructures" in the place of the "holy scripture" of the pre-Gutenberg era, before the rise of modern, experimental science. The question that we would like to propose for consideration, not so much as a scientific or philosophical hypothesis than as our thought experiment, arises from this and goes as follows: if the printing press promoted the secularization of mental horizons in philosophy and modern science, is it not possible that these new printing technologies could bring about a further secularization, the secularization of a naturalized rationality principle? There are plenty of indications to suggest that contemporary production methods assert the same principles as were at work in the days of Gutenberg, but on a new plateau. We want to consider this as a plausible scenario by following two lines of argument, one qualitative and one quantitative.
THE QUALITATIVE ARGUMENT
It is neither a new nor a bold thesis, at this point, to posit that information technology is essentially concerned with symbolic operations, and that these symbolic operations, not only from a philosophical but also from a technological perspective, cannot appropriately be reduced to the causal connections that are formulated in physics. Information is information, not matter or energy, as Norbert Wiener suggested more than half a century ago. The effectiveness of information technology does not develop on the same level as the effectiveness of heat, levers, gears, or any other mechanical device. Information technology controls the physical conditions symbolically. This means that information technology operates on a different "substrate" from the physical-material technology of clearly identifiable cause and effect; its symbolicity turns it into a "medial" substrate, medial in the sense that it allows for different possible ways of operation.
A seemingly natural objection that might be raised at this point would be to regard the electric current as a type of "physicality", and in so doing lend a familiar substratum in the traditional vein to the "void" of the symbolic. This does not, however, solve the problem of defining the relationship between electricity and symbols: quite the opposite. To this day, there is no coherent proposal as to how we are to view electric currents from a physical perspective: should we regard their elements as fields, as waves, as particles, or as impulses? The situation is no simpler from a philosophical perspective. All electronic technology is based on precisely that kind of algebraic analysis of symbolic operations which, due to its genuine un-intuitivity and non-representationality, not only triggered but also exacerbated what is known today as the foundational crisis of the sciences around the turn of the 20th century. Nevertheless, we experience electric current on a daily basis as the ubiquitous availability of energy, as the potential for potential, so to speak. If one were to understand this availability not merely as a phenomenal characteristic that has emerged as an afterthought, but as a constitutive element of electricity, a rift would open up in the relationship between symbolization and physics that is in no way inferior to the rift that exists between logical geometry a priori and the geometric description of reality a posteriori.
THE QUANTITATIVE ARGUMENT
If, since its invention, information technology has primarily been used to refine the regulating, switching and controlling operations of mechanical equipment, especially equipment that uses electric current, today a completely different dimension of application is being defined. With electronic and information-technological processes, materials can be modified and synthesized not only in their qualitative properties, but also in their physical behaviour, which is to say in their temporal energetic constitution. Artificial materials can now be produced by printing a synthesized ion and semiconductor structure (this is how silicon-based semiconductors are made, for example, but the principle is the same for organic carriers), and in combinations that are not familiar to us from nature. Photovoltaics are an example of this. They allow us to obtain energy directly from light without any combustion process and also without the interposition of other kinetic or dynamic transformers. It really is possible here to speak of "symbolic physics", not only because the "mechanics" are genuinely symbolically constructed (and not the inverse, which would have the symbolic structure derived from a "natural" mechanical context), but also on the basis of their industrial production processes, which today are rooted in information technology.
The principles of photovoltaics have been known for more than a century, and yet they have only recently become a relevant component in the discussion about energy supply. It is only in the last few years that manufacturing techniques have become available that make the production of such programmed materials feasible on a large, industrial scale. It is now possible to produce them in printing processes that spew out physically functional apparatus, just like the sheets of newsprint that come off the press at the New York Times. This goes hand in hand with a quantitative pricing and production development that is characteristic of information technology and is known as Moore's Law: a doubling of the total amount of produced instances every 18 months, which results in a cost reduction of 30% per year. The quantitative line of argument assumes that every development requires a critical mass of normalized instances in order to prevail. Such a large-scale fertile ground for these new dimensions of application in information technology has existed for just a few years now. Even so, with "smart" computer chips, this technology has rapidly established itself as an omnipresent feature of our living environment. These chips potentially allow all electrical devices to behave as components of variably configurable systems. The quantitative distribution of systems-capable entities, in fact symbolically capable actants, seems comparable, in a more serious way than just metaphorically, to its antecedent, the modern printing revolution.
This comparison may at first glance seem somewhat exaggerated, but as in the wake of the printing press, we too have experienced several abrupt advances that could not have been foreseen: for example, it took only 10 years for 5 out of 7 billion people—approximately 70% of the current world population—to gain potential access to wireless connections via information technology. Today we can, at least potentially, phone one out of two people in the world, irrespective of where on the planet they happen to be at that particular moment. Setting up and establishing the infrastructure needed for mobile telephony took only a decade, and yet it already feels commonplace to us today. If we want to assess the meaning of this quantitative argument adequately, we must keep in mind that this same technology would be nearly meaningless, and not just in the case of mobile telephony, if there were only a few thousand instances of it throughout the world. What makes it meaningful is that it has reached critical mass, and rapidly. And in this case too, the technology's fast and wide propagation has its foundation in the exact same production and manufacturing processes: information-technological printing technology. The same structural principle applies to the propagation of TV screens as much as it does to internet access, global positioning systems and the efficacy of Google: what would happen if Google could "only" link 1 million sites and had only 10,000 users who sporadically used its search engine? It would be virtually insignificant. Instead, it has now achieved a level of standardization that no longer just renders some qualities or aspects of our practices or behaviour less meaningful—a side effect of any standardization—but rather tips us "over the edge" into a situation where we are now developing new qualities on the very basis of this quantitative standardization.
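The quantitative line of argument can be made concrete with a back-of-the-envelope sketch. This is a purely illustrative calculation: the 18-month doubling period and the 30% annual cost reduction are the figures stated in the text, while their combination over a decade is our extrapolation, not a claim of the source.

```python
# Back-of-the-envelope sketch of the quantitative argument: a quantity of
# produced instances that doubles every 18 months, and a unit cost that
# falls 30% per year (both figures as stated in the text; the link between
# them is a learning-curve assumption, not a law).
doubling_months = 18
annual_growth = 2 ** (12 / doubling_months)   # roughly 1.59x per year
cost_decline = 0.30                           # 30% cheaper each year

volume, cost = 1.0, 1.0
for _ in range(10):
    volume *= annual_growth
    cost *= (1 - cost_decline)

print(f"after 10 years: {volume:.0f}x the instances "
      f"at {cost:.1%} of the original unit cost")
```

Nothing here depends on Moore's Law holding exactly; the sketch only translates the two stated rates into a cumulative order of magnitude, which is what the argument about critical mass turns on.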
THE CONTENTS OF THE BOOK
We sent this thought experiment along with our invitation to the speakers of the Printed Physics Conference in early summer 2010. Their contributions, however, represent their independently formulated positions, and refer to our overall theme only indirectly, mainly in the discussions that followed their lectures. In this book, we print the manuscripts of these lectures as distinct chapters, and add brief summaries of the main lines of argument as they were followed up in the subsequent discussions. These summaries have an indexing character; they are meant to provide a kind of conceptual mapping of the thematic landscapes through which we wandered, highlighting the most important topical reference points that were raised.
In the first chapter, “A Fantastic Genealogy of the Printable”, Ludger Hovestadt presents the current innovations in electronic engineering devices, available today for architectural application and integration, on all levels from design to construction and planning. Furthermore, he provides, in a historical account of what he calls “serious storytelling”, a conceptual model for considering the specific potency of digital technology.
The second chapter, "Technology and Modality", presents an article by Hans Poser who, as we should point out, was unable to attend the first conference in person, but has kindly allowed us to publish his article in our book. In it, he provides strong arguments as to why philosophy needs to pay attention to state-of-the-art technology when offering notions of reality and, directly related to that, notions of possibility and feasibility.
In the third chapter, "Primary Abundance, Urban Philosophy—Information and the Form of Actuality", Vera Bühlmann suggests taking a capacity- and capability-oriented view of the study of information, and reflects on the intimate and co-constitutive relation between philosophical thought and an idea of cityness; her special emphasis thereby lies on the role of technology in that relationship.
The fourth chapter, entitled "That Centre-Point Thing—The Theory Model in Model Theory", investigates the philosophical conditions for a machine-based episteme. Klaus Wassermann argues that we are currently experiencing a historic movement that he calls "de-centrement", and which he demarcates from the more common notions of decentralization and deterritorialization, in that this assumed turn towards ever-increasing de-centrement not only challenges any foundation, rules, structures, procedures and patterns that have served us so far for comprehending the world, but most crucially also urges us to calibrate anew the role of the model itself in order to arrive at a philosophical notion of information.
In a fifth chapter entitled "Digital Cathedrals", Helmut Geisert challenges the book's emphasis on relating digital technology to earlier printing technology with a retro-projection of how, throughout the 19th and early 20th centuries, people reflected on the relative cultural impact of Gutenberg's printing revolution. The article reveals a relationship that is as fundamental as it is troubling, between materialist thought and the problem of how to establish notions of proportionality and appropriateness within an architecture that has become secularized and profane.
In a final chapter, "Bringing and Positioning: Ways of Technology?", Hans-Dieter Bahr introduces Martin Heidegger's thought on technology, and his postulated change in modern technology's essential way of operating. By characterizing it as a "challenging-forth and ordering", rather than as a "setting and maintaining-in-place", Heidegger introduced many of the core issues behind the conflict between support and control, which we are striving to come to terms with today with increasing urgency.
PhD Colloquy | Signification || Communication: theory and applications of glossematic coding as method for pre-specific modeling
«The entities of linguistic form are of “algebraic” nature and have no natural designation; they can therefore be designated arbitrarily in many different ways.»
(Louis Hjelmslev)
Since Claude Shannon's Mathematical Theory of Communication (1948), the notion of information in its technically treatable sense is often distinguished from its linguistic sense by ascribing to the former, as opposed to the latter, a purely quantitative treatment. Yet ever since the founding documents of a general linguistics in the late 19th century, it has been clear to every linguist who affirms the break with the traditional study of language as philology that the notion of the sign is to be treated purely quantitatively as well. Ferdinand de Saussure's structuralist paradigm for understanding processes of signification views the linguistic sign as a quantitative value, yet as a negative one which cannot, in itself, be positivized. As a negative value, it can only be specified by 'profiling' it through infinitary lists and their net of contrasts: a 'this' can never be signified directly, Saussure held, but only by labelling it as 'not-that' and 'not-that' and 'not-that', etc. In short, a linguistic sign can only be determined structurally and differentially, within a framework of place-value distributions.
From a logical point of view, de Saussure's paradigm of negative determination obviously entails problems of methodological feasibility, since it holds, as a matter of principle, that the necessary infinitary lists can never be made exhaustively explicit. This is the decisive reason why de Saussure himself considered his own structural approach, which attempted to conceive of language as a system, as having ultimately failed. The post-structuralist critiques of such a notion of general linguistics are well known; yet from the point of view of algebraic computability (rather than that of logics), the situation looks different and is hardly explored today. Louis Hjelmslev is one of the very few linguists who continued the 'differentiability within negativity' approach initiated by de Saussure, by extending it mathematically. He considered Saussure's 'negative values' in a generalized sense as 'algebraic invariants'. In this way, the structuralist paradigm is opened up to probabilistic procedures such as Markov chains and the other algorithms with which today's programming languages ordinarily work. From the logical point of view, this can hardly count as a forward-pointing path, since it does not clarify how a notion of system could be objectified. Yet with regard to logistic networks, such fixation is (arguably) neither necessary nor desirable. Here, Hjelmslev's algebraic approach offers a powerful alternative to the predominant approaches in terms of semantic or object-oriented (informational) database logic and ontologies, because it is capable of abstracting from the distinction between natural and artificial/formal language, and need not subject one to the other: communication and signification can be treated as mutually complementary aspects.
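The probabilistic procedures mentioned above, Markov chains, can be illustrated with a minimal sketch. This is a hypothetical toy example, not taken from Hjelmslev or de Saussure: here a sign is characterized not by any positive content of its own but solely by the distribution of what may follow it, i.e. by its place in a net of contrasts.

```python
from collections import defaultdict
import random

def transition_table(sequence):
    """Estimate first-order Markov transition counts from a sequence of signs."""
    table = defaultdict(lambda: defaultdict(int))
    for a, b in zip(sequence, sequence[1:]):
        table[a][b] += 1
    return table

def step(table, state, rng):
    """Sample the next sign, given only the current one and its contrasts."""
    followers = table[state]
    signs = list(followers)
    weights = [followers[s] for s in signs]
    return rng.choices(signs, weights=weights)[0]

# A toy corpus of 'signs': each sign is determined purely differentially,
# by the weighted set of signs that may follow it.
corpus = list("abracadabra")
table = transition_table(corpus)

rng = random.Random(0)
state = "a"
chain = [state]
for _ in range(8):
    state = step(table, state, rng)
    chain.append(state)
print("".join(chain))
```

The point of the sketch is only structural: nowhere does the model assign a positive 'meaning' to a sign; the table of transitions is the sign's entire determination, which is one way of reading the place-value character of Saussure's negative values.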
In this colloquium we will work through Hjelmslev's Prolegomena to a Theory of Language (1943), and appropriate its methods in practice. We want to explore if and how structural linguistics as glossematics (in the sense of Hjelmslev) can be extended towards an alphabet of things capable of integrating the operability of generative linguistics (Chomsky et al.), and hence could provide a powerful method of pre-specific modeling.
Primary Readings:
Louis Hjelmslev, Prolegomena to a Theory of Language (1943).
Umberto Eco, A Theory of Semiotics (1976)
Complementary Readings:
[1] Gilles Deleuze, “How Do We Recognise Structuralism?” in: Desert Islands and Other Texts 1953-1974.
[2] Alfred North Whitehead, "Preface" in: A Treatise on Universal Algebra (1898)
[3] Jean Baudrillard, The System of Objects (1968)
[4] Gilbert Simondon, On the Mode of Existence of Technical Objects (1980)
SCHEDULE & TASKS
Tuesday September 24 2013
Introduction: the rise of linguistics amidst the competition between logic and universal algebra for a hegemonic position within an architectonics of communication: the structural paradigm for studying language. Mandatory Readings: [1] & [2]. Format: Lecture Vera Bühlmann
Tuesday October 1 2013
The General Criteria for a Theory of Language (part I): Hjelmslev chapters 1-6. These chapters present the 'specification' of glossematics. Task for this meeting: extract its parameters, principles, and hierarchy of principles. List them as if you were devising the description for a design task. Format: Presentations by students and discussion.
Tuesday October 8 2013
The General Criteria for a Theory of Language (part II): Hjelmslev chapters 7-10. These chapters present the horizon of application of the approach specified as glossematics. Task for this meeting: extract its outlooks, summarize them, make a list of possible objects of study. Format: Presentations by students and discussion
Tuesday October 15 2013
Glossematic Coding: Considering any artefact as 'text' in the glossematic sense (in terms of 'articulation' and 'partition'). Format: Lecture Vera Bühlmann. Task to prepare for the next meeting: choose an artefact with which you will work in the following sessions. Begin to organize your understanding of it as a 'text'.
Tuesday October 22 2013
no meeting (Seminarweek)
Tuesday October 29 2013
The specifics of glossematic coding: Hjelmslev chapter 11. This chapter introduces the role of «functions» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday November 5 2013
The specifics of glossematic coding: Hjelmslev chapter 12. This chapter introduces the role of «signs and figurae» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday November 12 2013
The specifics of glossematic coding: Hjelmslev chapter 13. This chapter introduces the role of «expression and content» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday November 19 2013
The specifics of glossematic coding: Hjelmslev chapter 14. This chapter introduces the role of «invariants and variants» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday November 26 2013
The specifics of glossematic coding: Hjelmslev chapter 15. This chapter introduces the role of «linguistic schema and linguistic usage» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday December 3 2013
The specifics of glossematic coding: Hjelmslev chapter 16. This chapter introduces the role of «variants in the linguistic schema» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday December 10 2013
The specifics of glossematic coding: Hjelmslev chapter 17. This chapter introduces the role of «function and sum» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday December 17 2013
The specifics of glossematic coding: Hjelmslev chapter 18. This chapter introduces the role of «syncretism» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday January 7 2014
The specifics of glossematic coding: Hjelmslev chapter 19. This chapter introduces the role of «catalysis» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday January 14 2014
The specifics of glossematic coding: Hjelmslev chapter 20. This chapter introduces the role of «entities of the analysis» in glossematics. Task for this meeting: think about how you can apply this distinction in the codification of your artefact-as-glossematic text. Format: Presentations by students and discussion
Tuesday January 21 2014
The specifics of glossematic coding: Hjelmslev chapters 21-23. These chapters reflect, from the point of view of glossematics, on the «language-non-language» distinction. Task for this meeting: think about what Hjelmslev discusses from the point of view of replacing the sound unit (phoneme) with information as a constitutive unit. Format: Presentations by students and discussion
PhD Colloquy | Information – in the light of the strange theory of light and matter (quantum electrodynamics)
According to Shannon and Weaver's mathematical theory of information, information is, strictly speaking, neither a value (a number) nor a magnitude (a quantity), but it can be treated symbolically in terms of so-called random variables: values governed by chance. But how, then, can we have a mathematical theory of information? In order to treat it mathematically, they postulated a proportionality framework within which information is to be measured: the more information in a system, they assume, the less strictly organized the system is, and the more uncertainly they take the system's behavior to unfold. With their notion of information entropy, Shannon and Weaver set up a framework analogous to the dynamic paradigm of thermal machines. Within this paradigm, the goal of communication processing is clear: to help reduce uncertainty by organizing a system's flows, transformations and exchanges more strictly.
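Shannon's entropy measure can be made concrete in a few lines; this is a standard textbook computation, added here purely as illustration. The less strictly 'organized' a source is, i.e. the more evenly its probabilities are spread, the higher its entropy.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A strictly organized source (one outcome nearly certain) carries little
# information; a maximally 'disorganized' one carries the most.
print(entropy([0.97, 0.01, 0.01, 0.01]))  # low: highly organized
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: the maximum for 4 outcomes
```

The proportionality framework described above is visible directly in the formula: uniform probabilities, the least strictly organized case, yield the maximal value.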
What if we tried to view a mathematical theory of information within a quantum paradigm rather than a thermodynamic one? The most important change as opposed to the thermodynamic paradigm is that the formalism's capacity is not bounded in principle by a restriction to the real number space: instead of focusing on the (representative) determination of random variables, what is of interest here is the (operative) articulation of path integrals. According to a quantum paradigm, we can deal with a decoupling and open-ended paralleling of what (within the thermodynamic paradigm) needs to be nested within one comprehensive system: in a quantum paradigm we can deal with (1) a stream of 'data', (2) a formalism that captures quanta from this stream (proportionally to its individual capacity to integrate), and (3) an act necessary for deciding when, and with regard to what, the formalism is to capture and integrate 'stuff' from the live stream. The goal in this paradigm is to increase a model's tolerance for coping with uncertainty, not to decrease it. The outlook promised by this paradigm is that the model's capacity to integrate probable behavior can be developed by training (e.g. with self-organizing maps, SOMs).
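The training outlook mentioned at the end, self-organizing maps, can be sketched minimally. This is a toy one-dimensional SOM, our own illustration rather than part of the colloquium material: the map's nodes are trained to cover the spread of an incoming 'stream' instead of collapsing it into a single representative value, which is one way to read the claim that a model's tolerance for uncertainty is to be increased rather than decreased.

```python
import random

def train_som(data, n_nodes=5, epochs=50, lr=0.5, radius=1, seed=0):
    """Train a minimal one-dimensional self-organizing map on scalar data.

    Each node keeps one weight; the best-matching node and its index
    neighbours are pulled towards each sample, so the map comes to cover
    the distribution of the stream rather than collapse it.
    """
    rng = random.Random(seed)
    weights = [rng.random() for _ in range(n_nodes)]
    for _ in range(epochs):
        for x in data:
            # best-matching unit: the node whose weight is closest to x
            bmu = min(range(n_nodes), key=lambda i: abs(weights[i] - x))
            for i in range(n_nodes):
                if abs(i - bmu) <= radius:
                    weights[i] += lr * (x - weights[i])
        lr *= 0.95  # decay the learning rate over the epochs
    return sorted(weights)

# a 'live stream' with two clusters of behaviour
rng = random.Random(1)
stream = [rng.gauss(0.2, 0.03) for _ in range(50)] + \
         [rng.gauss(0.8, 0.03) for _ in range(50)]
rng.shuffle(stream)
print(train_som(stream))
```

After training, the node weights straddle both clusters of the stream: the map has integrated the spread of the data instead of averaging it away.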
The main reading of this colloquium is Richard P. Feynman's QED: The Strange Theory of Light and Matter (Princeton University Press 1985).
Key notions to develop a clearer understanding:
integrals, path integrals, complex numbers, probability, probability amplitude, random variable, quantization, encoding, representation, rendering, live streams of data, mathematical structure, categories, classes
First meeting is on June 18th 2013
Meetings are held on Tuesdays, 9 am Swiss time, at the CAAD Chair in Zürich. If anyone is interested in joining via Skype, you are welcome to email me.
Program & readings
Tuesday June 18 2013: Introduction to the program, the generic and the pre-specific
reading: Plato, Timaeus [1]
Tuesday June 25 2013: computability and probability
readings:
Stanley Burris, The Laws of Boole’s Thought. [2]
Walter Carnielli, Polynomizing: Logic Inference in Polynomial Format and the Legacy of Boole [3]
Claude Shannon, Mathematical Theory of Communication [4]
Tuesday July 2 2013
reading: Feynman, chapter 1 (Introduction) [5]
Tuesday July 16 2013
reading: Feynman, chapter 2 (Photons, particles of light) [5]
Tuesday July 23 2013
reading: Feynman, chapter 3 ( Electrons and their interactions) [5]
Tuesday July 30 2013
reading: Feynman, chapter 4 (Loose Ends) [5]
Tuesday August 6 2013: The quantum view vs the thermodynamic view
PhD Colloquy | Computability – considered in the light of the Master Argument (Diodorus Cronus)
Computation is largely treated today as the procedure to »mechanize« »logics«. Our interest, with a projective theory of technology, is not to reject (negate) or affirm (analyse) the assumptions involved, but to sort them out strategically. Our interest is to complement the scientistic paradigm of »control« for theorizing technology with a humanistic dimension of ability and artistic mastership. This interest has a long tradition in philosophy, and crystallizes in the so-called Master Argument. The Master Argument concerns the question of whether and how we can meaningfully and methodically involve temporality and self-referentiality in logical/formal considerations. The inferential structure of the Master Argument was first articulated by Diodorus Cronus in the 3rd century BC; it attempts to formalize a paradox that has preoccupied all the main steps in the development of systematic thought ever since. This is why the many attempts to formalize this paradox provide, for our projective-theory interest, a rich and differentiated reflecting surface that allows us to investigate, comparatively, how these questions have been treated over time.
While the philosophical interest in the Master Argument was mainly in questions of legitimation and foundation, our interest in it is operational. We will not take, allegorically speaking, the position of the Despotic Priest, the Philosopher King, the Statesman or the Assigned Administrator, but that of the Symbolical Metallurgist. In short: we will seek to extract from the Master Argument and its history a template that allows us to cultivate computing as an ability, namely the template of a mechanism for learning how to learn when equipped with the generic methods of algebra.
We will read Jules Vuillemin's book Necessity or Contingency: The Master Argument (Center for the Study of Language and Information, Stanford University Press 1996). The historical account he gives is framed by the rôle of probabilistics for information science and computing, and is thereby especially relevant.
PhD Colloquy | Computation and the question of the applicability of arithmetics
Computation is treated today as an art, just as mechanics had been in the Renaissance and Baroque periods. This basically means that its actual performance is widely recognized and welcomed: striking in effect, unexpected, fascinating and also convincing-by-fact, while at the same time the actual methods and procedures are applied rather like recipes. Over time, this gives rise to: 1) a lot of the same, i.e. boredom; and 2) vast disputes around ancient questions on the rôle of technics in the nature of reasoning, intelligence and science. We want to gain better insight into the modern theoretical context of these topoi, and will start by reading Michael Potter's introductory book on the main stances, Reason's Nearest Kin: Philosophies of Arithmetic from Kant to Carnap, Oxford University Press 2002. Meetings are held on Wednesdays, 11 am (Swiss time), via Skype, between the CAAD Chair in Zürich and the NUS / ETHZ Future Cities Laboratory in Singapore.
Metalithicum Colloquy #4 Popularizing Insistencies
The fourth Colloquy, »popularizing insistencies«, was held February 1-3, 2013. From an architectonic perspective we approached the topic of today's information-based technology as a Gestalt of technology which renders functional whatever can be algebraically articulated. Let us, hypothetically, call the peculiar artefactual units that are articulated, by such double articulation, quantums of cityness. Let us further assume these units are to be, by conception, capable of articulating the consistencies within what structures globalized urban lives on any level: infra-, media- and superstructure. Let us further assume this quantum-elementarity of cityness to be the complementary dimension to existentiality, and call it that of the insistential. How could a possible differentiation and reflection of the ability to articulate quantums of cityness within an assumed grammaticality be considered, via traditional structural forms, as something like the ability to index, catalogue, categorize and classify, and this, with the stress on the word ability, as something we can learn to achieve in an open degree of mastership, just as is achievable in any other ability we can acquire and learn?
Introduction
Dr. phil. Vera Bühlmann, CAAD ETHZ
A Quantum City
Prof. Dr. Ludger Hovestadt, architecture and information science CAAD ETHZ
The Topos of Infrastructures in Modernity
Prof. Dr. Kate Marshall, media studies and literature, University of Notre Dame IN USA
The Technics of Prehension
Prof. Dr. Nathan Brown, literature and philosophy, UC Davis CA USA
Quantum Science and Metaphysics
Prof. Dr. Michael Epperson, philosophy, California State University Sacramento CA USA
Characteristica Designata: Algebraic Articulation, Mathesis, Grammaticality
Dr. phil. Vera Bühlmann, media philosophy, CAAD ETHZ
On the Baroque Tsunami
Prof. Dr. Gregg Lambert, literature and philosophy, Syracuse University NY USA
Cityness beyond the Apocalyptic
Evan Calder Williams, philosophy, University of Torino IT
Temporality, Forms of Knowing, and the City
Prof. Dr. em. Werner Oechslin, history and theory of architecture ETHZ
Metalithicum Colloquy #3 Symbolizing Existence
The third colloquy, »symbolizing existence«, was held from May 20-22, 2011. Its point of departure was the current, rapidly proceeding "deterritorialization" of everything that was once regarded as stable and binding in our cultural and intellectual history. What we today regard as statistically encoded information is capable of explicating and indexing the entire realm of what can be known, i.e. of communicating and reproducing it, through a cascade of geometrical, functional, or finally logified schemas. The third colloquy asked how we might understand ways of dealing with the informational double articulation of things, ideas and modeling: once as something concrete and actual, and once as a symbolic-formal template, sample or format. We are currently experiencing a rapid loss of "grounding" of that which we once considered binding in our cultural and intellectual history. How can we obtain, articulate and cultivate a way of thinking about "instances" that does not fall back into a schematic model Platonism (thereby falling behind Plato), and that does not remain enmeshed in an Aristotelian realization dynamics with a naturalism organized by original genus, kinds, and specific marks of distinction?
The central phenomenon considered was the technological process of doping materials: at the quantum level, a particle or its representation, the point, is no longer "that which has no parts" (Euclid).
Introduction
Dr. phil. Vera Bühlmann
Nature, Entropy & Design
Prof. Dr. Christophe Girot, landscape architecture ETHZ
Symbolizing Existence? On the Fusion of Actuality and Possibility
Prof. Dr. Hans Poser, Professor em. of Philosophy at TU Berlin
Pre-specific. On the Relation between Information and Form
Dr. phil. Vera Bühlmann, media philosophy CAAD ETHZ
From Pebbles to Digital Signs: the Joint Emergence of the Signs for Writing and Numbers, their Intercultural Standardization and their Renewed Unification in the Digital Age
Dr. Gert Schubring, PD for the History of Mathematics at Bielefeld University
Narrative Infrastructures and Technical Bundling
Prof. Dr. Ludger Hovestadt, architecture and computer science CAAD ETHZ
Distribution. An Essay on the Bodies of Form
Klaus Wassermann, biology, computer science, philosophy of language and epistemology CAAD ETHZ
Articulation and Resonance. Virtual Topologies of Aesthetic Mediality in Music
Prof. Dr. phil. Michael Harenberg, Professor of Composition and Media Studies, Director of the Music and Media Art program at HdK Bern
Metalithicum Colloquy #2 Domesticating Symbols
The second Colloquy, »domesticating symbols«, was held from November 14-16, 2010. It looked at the entropic dissolution of symbolic structures we are experiencing today and explored various approaches towards learning to cultivate code. Genre-logical and language-historical backgrounds were considered regarding the modern transition from a primacy of thinking in terms of substance (Substanzlogik) to one of conceptual relationality, in which terms are determined by functional mappings (Funktionslogik). The possibility of a further transition from set-theoretical mappings by functions to a categorical kind of operationability, an ability in dealing with symbolic operations, was considered. At the center of the discussions was the question regarding the respective rôles of symbols, signs and numbers, as well as their respective relationships, in the cultivation of code.
The central phenomenon considered was photovoltaics: its capacity to capture energy by coding instead of exploitation, to integrate energy into the ecosphere of the planet's natural balance in additional quantities, and, furthermore, to do so in the genuinely abstract form of electricity, which allows any form of energy to be converted into any other.
Topic of the second colloquy: »domesticating symbols«
Following the first colloquy on »printed physics« in early summer 2010, the second colloquy, »domesticating symbols«, will now take place from Friday, November 12 to Sunday, November 14, again at the Stiftung Werner Oechslin in Einsiedeln. For this we invite five speakers, covering their lecture, travel, board and lodging. Following the colloquy, the contributions are to be published in a book. The colloquy is not public, but a limited number of interested guests will be invited to participate in the discussion.
For our second colloquy, »domesticating symbols«, we take as our point of departure the at first rather sober observation that the substrate on which data-processing machines operate today has changed not only quantitatively but also qualitatively since we learned to handle energy technically in the form of electricity; information-technological media and apparatus no longer operate primarily on the substrate of physical forces and their mechanical principles. Rather, they unfold their effectiveness on a quasi-immaterial substrate consisting of the signal horizons of the symbolic codings with which the former physical substrate is now formally represented as "data" in the sense of "informational constellations". It is important to emphasize here that information technology today is no longer limited to steering and probing, in a more refined way, only those processes that had already been accessible to us with mechanical apparatus. Rather, a development is currently under way in the course of which the energetic foundations of our world themselves could be freed from their mechanical constraints: photovoltaics makes it possible, for the very first time, to obtain energy in the form of electricity directly from the light of the sun, and ultimately to do so entirely beyond the ever scarcer resources that our planet holds in store in the form of material energy reserves.
As the frame of reference for the symbolic substrate of today's information technology we therefore no longer want to proceed, at least hypothetically and for the duration of this colloquy, from the closed force relations of physical processes. "Information is information, not matter or energy," Norbert Wiener had already formulated more than half a century ago, although for him probably only the transformation of energy-consuming devices into information technology had been foreseeable. At the latest with this most recent transformation, however, the transformation of the infrastructures for the production and distribution of energy into information technology, the questions of what can be operationalized now pose themselves anew, also from an application-oriented, pragmatic perspective.
»Population Dynamics. From Quantities to Qualities«
Admittedly, the "symbols" with which information technology deals are meant, in the first instance, in their formal-mathematical sense, no differently than in the systems of functional equations of mechanical technology. Yet the question of an adequate frame of reference for assessing the spaces of potential opened up by these information-technological transformations is pressing ever more insistently toward the center. For by means of symbolic coding the behavior of mechanical force relations can indeed be represented and controlled. But at the same time other webs of efficacy, especially genuinely social contexts, can also be represented symbolically and organized medially. Entire business models of companies such as SAP or IBM rest on this. A strictly formal-mathematical concept of the said "symbols", however, suggests a way of dealing with information technology that treats it as a continuation of mechanical technology – now simply analytically accessible and therefore more powerful by orders of magnitude in what, seemingly optimizable and automatable, can be operationalized with it. From this application-oriented perspective one might say, somewhat provocatively, that the computer, conceived as a universal Turing machine, thus remains nothing other than a geometrical machine, precisely because of the analytical notational standards and at the same time in spite of them.
The Metalithikum colloquies take as their point of departure that the behavior of processes medialized by information technology cannot be adequately described within the frame of reference traditional to our understanding of technics, that of physical mechanics. With the fantastic-programmatic notion of the Metalithikum we look for the possibility of an abstraction here. With this neologism we attempt to grasp the said qualitative change in our relation to stones and their geo-index: meta- for abstraction, and -lithikum designating the various epochs of the so-called Stone Age, with particular reference to the Neolithic, which stands for the Neolithic revolution and the beginning of settledness, of the domestication of nature. For behind the newly emerging operable availability of the ways in which we have learned to symbolize and operationalize the contexts of our world, the substrate of our existence changes, along with the technical substrate, into a kind of "symbolized physics" that we wish to call – provisionally, as a makeshift rather than out of conviction – medialized nature.
The first colloquy, »printed physics«, examined the technical functional principles of current information technology indirectly and projectively, by considering the conditions of its production-related (re)producibility and the economic availability that results from it. By way of illustration, we suggested a comparability between the "print revolution" currently taking place in the production of information technology and the introduction of the printing press at the beginning of the modern era. If the latter had contributed – in a way probably unimaginable to Gutenberg's contemporaries – to a popularization and thereby a secularization of knowledge that until then had largely been administered by the monasteries, we suspect that the present changes will have cultural consequences of comparable magnitude. It is important to keep in view that today's "print revolution" does not primarily concern – as early considerations, among others by Marshall McLuhan, had already pointed out – the availability of documents, that is, of representing, depicting products such as books, journals, newsletters or emails, as well as radio and television broadcasts. Profound upheavals are taking place on that level too; but the development appears to us far more fundamental insofar as it concerns the production of the apparatuses themselves that operate within the physical: from processors, memories, antennas and amplifiers, from the light emitters in screens and lamps, et cetera, all the way to photovoltaics, functioning machines are being produced as printed foils, and this has long since reached industrial scale.
At least since the modern era, energetic processes have counted for us as universally describable in their principles, yet as locally situated in their events, and precisely therein as geo-metrically grounded. The mechanical principles for dealing with fire or wind, water or stones therefore count for us as "natural", even though their processes can certainly be deliberately triggered, and in this sense effected, by mechanical apparatuses, by motors, or even by seemingly self-acting automata; until now, however, they remained largely dependent on the channeling and conducting of physical forces through cultural techniques – and therein also situated and grounded – from the early modes of agriculture to the modern infrastructures of today's energy supply.
But if we can organize our energy supply with electricity that we harvest directly from the streaming of the sun and make available by means of information-technological networks, these principles of situatedness and groundedness of mechanical-geometrical cultural techniques change into a principle of deterritorialization that is characteristic of all digitization. We already know the changed accessibility and availability of digitized documents from the internet, and a similar accessibility and availability could also develop for the digitization of the apparatuses, as well as of the production of energy.
The second Metalithikum colloquy, now on the topic of »domesticating symbols«, asks how we might respond to the situation that these machine-produced, symbolized webs of efficacy permeate our everyday life almost ubiquitously. The developments, at first purely quantitative, will probably soon bring about the appearance of other qualities as well. What if they reach a certain saturation across the populations, so that new structures begin to form on their basis, up to where we can opaquely suspect the conditio humana? We want to attempt to weigh the specific spaces of potential that are becoming accessible to us today, and to reflect on the changed conditions of how we speak about and deal with them.
»Strategies of Cultivation and Domestication within the Symbolic«
We are today in a position, and indeed in the midst of doing so, to produce a kind of web of efficacy in the form of printed foils whose physical-energetic behavior we can code symbolically.
The decisive turn, it seems to us, consists in our experiencing a peculiar self-referentiality between what can be technically formalized and operationalized and a physically conceived naturalness, which at least since the modern era had served us as the plane of reference for the algebraically-functionally conceived symbols and the connections they make possible, and into which we have placed our apparatuses, machines, motors, etc.
Kant still described an efficacy that could not be derived from this physical plane of reference as the doings of spirits. An efficacy that does not appear to be primarily physically motivated, an in this sense immaterial efficacy that nevertheless unfolds materially, he did not attempt to explain away as a phenomenon, but he wanted it excluded from the jurisdiction of his concept of reason. The limits of a positivity thereby placed at the center of scientific inquiry and development show themselves clearly today, and find their externalization in electricity and information technology, which by now dominate everything.
The rational grids serving as projection surfaces for functional mappings of webs of efficacy, which for Kant counted as the basis for reflecting on natural processes that could then still promisingly be described geometrically-mechanically and quantitatively-linearly, still operate today, in fact, as the frame for practically all fantastic projecting. We call this projecting fantastic because this projection space has become the topological space of associative networks. On the technical substrate of "information" and the global networking that grows out of it, mechanical causality as the explanatory principle of the webs of efficacy of territorial spaces dissolves, for material as well as social logistics, into a probabilistics that as such can no longer be referred to any universal measure or metric. Nor does it unfold on any foundation; it unfolds outside any space determined by its boundaries, that is, literally in deterritorialized milieus. Where might we suspect specific strategies of cultivation and domestication that could be derived and set out specifically for something like a constructing within the symbolic?
Zurich, October 2010
Constructing within the Symbolic
Prof. Dr. Ludger Hovestadt, Architecture and Computer Science, CAAD ETHZ
[vimeo http://vimeo.com/47826934]
Polylogues between the Times
Prof. Dr. em. Christian Doelker, Media Education, University of Zurich
[vimeo http://vimeo.com/47767018]
Sign Scenarios. Findings from the Theory of Language and the History of the Species on the Human Capacity for Signs
Prof. Dr. Ludwig Jäger, Semiotics and German Philology, RWTH Aachen University
[vimeo http://vimeo.com/47817266]
Philosophical Questions around the Theme of Symbolization, from a Legal Perspective
Prof. Dr. Marcel Alexander Niggli, Criminal Law and Philosophy of Law, University of Fribourg
[vimeo http://vimeo.com/47821914]
Digital Availability. Techné and Techniques of the Symbolic
Prof. Dr. Georg Christoph Tholen, Media Philosophy, University of Basel
[vimeo http://vimeo.com/47826346]
Symbols in the Household. The Model in Architecture
Prof. Dr. em. Werner Oechslin, History and Theory of Architecture, ETHZ
[vimeo http://vimeo.com/47828202]
The Double Articulation of a Digital Rhetoric
Dr. phil. des. Vera Bühlmann, Media Philosophy, CAAD ETHZ
[vimeo http://vimeo.com/47855312]
Metalithicum Colloquy #1 Printed Physics
The first colloquy, »printed physics«, was held from May 12 to 14, 2010. It looked at the technological developments by which the physical characteristics of materials can be formally analyzed, technologically constructed and (bio-)chemically synthesized, and which – hence the wording of the title – are meanwhile being industrially produced and distributed worldwide: electronics. It is no exaggeration to call this an actual printing revolution, although, unlike in Gutenberg's day, the printed products do not stand for something specific, their symbols do not represent; instead they are functionally operative, articulating what is only generically determined, in a sense that we call pre-specific.
The central phenomenon considered was the production procedure of printing electronics, together with its economic context and prospects.
The challenges of interfacing technology to everyday life
Prof. Dr. Ludger Hovestadt, Architecture and Computer Science, CAAD ETHZ
On the History of Systems of Thought
Prof. Dr. em. Werner Oechslin, History and Theory of Architecture, ETHZ
Bringing and Placing – Modes of Technics? Reflections on Martin Heidegger
Prof. Dr. em. Hans-Dieter Bahr, Philosophy, University of Vienna
On the Question of Construction within the Symbolic
Dr. phil. des. Vera Bühlmann, Media Philosophy, CAAD ETHZ
The Story with the Center. The Model of Theory in the Theory of the Model
Klaus Wassermann, Biology, Computer Science, Philosophy of Language and Epistemology, CAAD ETHZ
Some Thoughts on the Question of Appropriateness in Architecture
Helmut Geisert, Architect and Publicist, Berlin
Electricity as energy in its abstract form
THIS IS AN EXCERPT FROM: GENIUS PLANET, ENERGY SCARCITY TO ENERGY ABUNDANCE, A RADICAL PATHWAY, BY LUDGER HOVESTADT, VERA BÜHLMANN, SEBASTIAN MICHAEL (publication is currently in preparation)
//
It does not necessarily seem all that intuitive, this notion of electricity as an “abstraction” of energy. What, you are entitled to ask, is an “abstraction” of energy? Is energy not either energy or no energy? How can you make energy abstract? And perhaps here we should qualify. Maybe “abstraction” is not so helpful a term. Because do we mean that energy becomes an idea of itself? That it can no longer do what it could do before it was “abstract”? No. What we mean is that it has become removed from its source and can now be dealt with in a non-material way. It has, effectively, become a meta-form of itself.
Really? Well yes.
Take a log of wood. It’s a physical object: we can stack it up, chop it in two, touch it, smell it. When we burn it, we know it’s burning, there is no doubt about it, it’s visible, tangible. In order to get it from the shed to the fireplace, we have to go outside, lift it up and carry it in. If we want to make a big fire, big enough, say, to move a locomotive around a track, we don’t use wood, we use coal, which is fossilised, very old wood, and we burn that instead. Instead of going to the shed we go to the coal mine and dig it out from there. To get it from the coal mine to the train depot we load it onto lorries and trains.
How does this compare to an electric fire, or an electric train? What can you actually see, or touch, here? Nothing. The electric “fire” is no fire at all; it’s a few metal bars that get red hot. So yes, if you touch them you’ll burn your fingers, but the bars aren’t burning, they just convert the energy from electricity back into heat. Or an electric train: we’ve remarked once before on the extraordinary fact that all it takes to get an electric train moving is a power cable and a pick-up. And what’s so brilliant is that whether we want to light an electric fire in our living room or drive a train from Hamburg to Milan, the way in which we now “package” or “handle” the energy is the same: electricity can do either, and a million things on top.
So perhaps what we should say is that electricity is energy once or twice or more often removed from its source. The source can be anything. If it’s coal, then you burn it to generate heat: the heat would be energy once removed from the coal. Turn that heat into steam and power a turbine, and you get motion; the motion would be energy once removed from the fire and twice removed from the coal. Neither the fire nor the motion, though, strikes us as particularly abstract, and the reason is that they’re still visible, tangible, and there are only so many things you can do with either. But the moment you remove the energy once more and turn it into electricity, you’re really doing something new and quite different with it. Because now you turn it into something that can be anything. A fire can’t be “anything”. A fire can only be hot and moderately bright. On its own, the fire can’t refrigerate your milk and it can’t store your photographs. It can’t suck the dust out of your carpet and it can’t ping you the football scores. It can’t get you to watch the moon landing and it can’t calculate the interest on your loan. Without somebody standing by and making smoke signals, or blocking it off and then revealing it again at specific intervals, it can’t even send you a short message. Actually, it can: by burning, say, on a mountain top, it could send a message to somebody on another mountain top. But that’s about as far as it goes.
So what’s new isn’t that we can now, with electricity, imbue energy with meaning. We’ve been doing that for as long as we’ve been using energy. What is new is that up until the moment when we start to use electricity, the potential uses that we could put energy to, and therefore also the potential meanings we could associate with it, were determined by its own “type” or “form”. So, much as there is indeed a range, but a limited range, of uses we have for fire, and a fairly wide but still limited scope for the forms that fire can take and the meanings it can have, once we turn the fire into electricity, the range of uses and the scope of meanings become limited primarily by what we are capable of articulating as an idea. We no longer need to talk of potential uses and meanings, we can simply talk of potentiality itself, the potential of potentials. Because many, you could say most, of the uses and meanings that we now put electricity to could not even be imagined at the time electricity was first experimented with.
We can express this a little differently. We can say that up until the time, around now, over the last fifty or sixty years, when we began really to understand and make use of digital encoding and computing, the application of energy was always derived from its own “meaning”. “Meaning” to us, of course, but nevertheless “meaning” that was inherent in the energy itself. To stick with fire, because it is so elemental: the meaning that fire has for us is intrinsically linked to its own natural qualities, the way we experience them. We experience fire as “hot”, so it follows that fire to us means “warmth, comfort”. We experience that fire scares wild animals, so to us it means “safety”. We experience fire as lighting up the dark, so to us it means “light”. From these meanings we can derive uses that again follow “naturally”. Once you realise that cooked food tastes better and is more easily digestible than raw food, it makes sense to use fire to prepare food by cooking it. So close is the connection between the meaning and the use that we consider them practically inseparable, and that too is no coincidence, because we most likely discovered the use by applying fire in its meaningful way, even if it was purely by accident.
With electricity, we are now able to put the energy to uses for which we may not even have a meaning to start with. The meaning may yet be invented or reveal itself. If you struggle with the thought of energy having any “meaning” at all, you may find it easier to think in terms of “form”. With electricity, it is no longer the case that the form that energy takes – such as fire – determines how we codify it. Instead, the way we codify energy – very specifically electricity – determines its application. You could say that the application is another, new “form” of the energy. And so you have, for the first time, a reversed relationship between energy, its form and its use. No longer does the form of the energy determine its uses; instead, the use of the energy determines its form.
And it is in the word “codify” that you get a clue as to what we’re getting at here: looked at in this way (and we’re aware this is one particular way of looking at it, this is not an absolute “truth” or an incontestable finding), energy could be said to be symbolic. Again, it doesn’t come so easily to us to think of energy as “symbolic”. How can energy be “symbolic”, you may ask, and it’s certainly true: we do not experience energy as “symbolic”. But that’s precisely the point. We do not experience energy as “symbolic”, because to us energy is “real”. And that’s because whenever we come in contact with it, we do so when it has taken on a form that we can relate to again: heat, motion, light. We always experience energy as “real” and not as “symbolic” because there is no other way for it to manifest itself to us. Before we turn it into electricity it is either heat or motion or light or sound, and when we become aware of it again, we do so because we’ve turned it back into heat or motion or light or sound. And these are not abstractions, nor are they symbols of energy; they are, to us, realities. So what about electricity? Is that not “real”? Of course it’s real enough, otherwise none of this would be happening, we wouldn’t be writing, you wouldn’t be reading, there would be nothing on TV tonight and nobody would be talking about an energy crisis.
But how would you describe it, as what? Well, exactly.
In a climate and environment of abundance, the critical questions regarding the reality of things are to be posed in terms of saturation (not representation)
Our friend, the electron
When we talk about “data” and about “information”, we strain to imagine what it is that actually happens with them in a digital network. We always see the result of data travelling through the network, because the computer goes “ping” and we can read an email that’s been sent, or we can watch a news item being streamed live while we’re at the station waiting for a train that’s been delayed, but it’s well-nigh impossible to visualise, in your head, what actually happens in order for us to be able to do so.
That’s until you become friends with the electron. Being a subatomic particle with a mass of a bit more than one two-thousandth that of a proton, the electron is very small indeed. If you were to place an average-sized grapefruit next to an average-sized pea, you would have, in very broad approximation, the relative sizes of a proton and an electron. Except that inside the atom they would be rather further away from each other. To get an idea of how far, put your grapefruit in the palm of Nelson in Trafalgar Square in London. You would then find the pea whizzing around the M25, which, if you don’t know London that well, is the suburban orbital motorway that encircles the city approximately 20 miles from the centre. Bearing in mind the pea size of our electron relative to the grapefruit nucleus in Nelson’s hand, we then have an atom the size of Greater London. But an atom is hardly the size of London. An atom itself is so small that one million atoms, lined up next to each other, make up about the thickness of the page you’re holding in your hand, if you are reading this book on paper. So you can see just how small an electron really is: it’s as big as a pea inside an atom as big as London, yet it takes a million atoms to make up the thickness of a page in a book.
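The figures in this analogy can be checked on the back of an envelope. The sketch below uses rounded, order-of-magnitude values; the atom diameter and page thickness are our assumptions for illustration, not exact measurements:

```python
# Rough, order-of-magnitude figures only (assumed for illustration).
PROTON_ELECTRON_MASS_RATIO = 1836   # the proton is ~1836 times heavier
ATOM_DIAMETER_M = 1e-10             # a typical atom, about one ångström
PAGE_THICKNESS_M = 1e-4             # ordinary paper, about 0.1 mm

# "a mass of a bit more than one two-thousandth that of a proton"
electron_fraction = 1 / PROTON_ELECTRON_MASS_RATIO
print(f"electron mass ≈ {electron_fraction:.6f} of a proton")  # > 1/2000

# "one million atoms, lined up next to each other, make up about
#  the thickness of the page"
atoms_per_page = PAGE_THICKNESS_M / ATOM_DIAMETER_M
print(f"atoms across one page: about {atoms_per_page:,.0f}")
```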
Tiny as it is, the electron is still negatively charged. In fact you may recall it has an elementary charge of −1. And that is why, in energy terms, the electron is our friend. Because being so small it can travel exceptionally fast, at nearly the speed of light. And having an electrical charge, even a tiny one, it will change the state of any atom it happens to travel to, because most atoms most of the time are balanced in terms of their electric charge, which means they are neutral. (If that weren’t the case, you’d continually get small or large electric shocks when touching things. But as you know from experience, that only happens when you either touch something that has been deliberately electrified, such as a wire with a current running through it, or something that has got accidentally charged, such as a metallic surface that has been exposed to some friction, or the hand of somebody who’s been moving about a lot in a synthetic garment.) So because the “normal” state for most things, when nothing is happening to them, is neutral, each time an electron comes along, with its negative charge, it upsets things a little. The atom where the electron has arrived is now either negatively charged too, because the balance is out of kilter, or it, the atom, does something drastic, like expunge another of its electrons. It’s the expunging that’s the interesting bit, because the electrons that are already there really “want” to be there. They don’t “want” to leave. (The electron, with its minute size and subatomic nature, does not, we know, have a “will”; we are trading metaphors here…) So there’ll be a fair bit of jostling before one of them goes, and that jostling is energy, which registers as either heat or light or a magnetic field. And so although the dimensions are crazy, and there’s an awful lot of nothing in an atom, the electrons, with their diminutive size, have a fantastically big impact on atoms. They are a bit like a courier.
We are oversimplifying things to some considerable extent now, but like a courier, they can carry a message (information) or a log of wood (energy). And the reason they can do either is that we have, about a hundred years after starting to make use of electricity as energy, found a way of using energy to symbolise information. We decreed that an electric charge should signify “on” or “yes” or “1” and that no electric charge should signify “off” or “no” or “0”. And we worked out that if you break things up into small enough units, you can codify any piece of information in precisely these two contrasting expressions: “on” or “off”, which is the same as “yes” or “no”, which is the same as “1” or “0”. (This, incidentally, covers only one technical aspect of digital encoding. In order to make information encodable as it is today, developers and inventors had to break with centuries-old traditions and invent a whole new algebra, as well as programming languages and bit-based binary code itself, which to go into would, at this stage, be stretching things a little too far.)
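As a minimal sketch of this “on/off” codification: any piece of information, broken into small enough units (here, the ASCII bytes of an arbitrarily chosen word), can be written out purely in the two contrasting signs:

```python
# Break a message into units (bytes) and express each unit as eight
# "1"/"0" signs — charge on, charge off.
message = "on"
bits = "".join(f"{byte:08b}" for byte in message.encode("ascii"))
print(bits)  # → 0110111101101110
```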
The “digital” age was born, which is why to this day you see graphic designers illustrate all things to do with computers and information technology with cascading, travelling or otherwise moving ones and noughts. And this achievement of using electricity to codify information has given us the ability to network information and deal with it, share it, distribute it, at the speed of light across the globe. But if we can “symbolise” information using electricity – which we patently can – then we can also “symbolise” electricity, using information. Now we no longer restrict ourselves to saying “electricity can carry information”; we can also say “information can guide electricity”. Because it clearly can: we don’t have to treat electricity as if it were a barrel of oil or a heap of coal. It isn’t. It’s an abstraction of a barrel of oil or a heap of coal, or anything else we like to use to “generate” it. We can treat it as such; we can treat it as information.
Symbolical metallurgy | substance as essential invariance that features in the structures formulated by the laws of conservation (Emmy Noether)
Digital architectonics, pre-specific modelling – what is at stake with these issues? We want to hold on to these terms because they promise to help find an abstraction from the by now classical distinction in cultural studies and anthropology between the Renaissance disegno tradition of how to think about technics and the contemporary, media-archaeological interest in cultural techniques. But why seek an abstraction from this distinction at all? While the former cannot seem to rid itself of an understanding of the artist as placeholder of divine spirit on earth, the latter aggressively seeks to annihilate, in its value for hominization, the fact that there is something like richness in esprit, wit, a capacity for intellection (what in German is best captured in the adjective “geistreich”). The latter’s materialist point of view ought to be opened up by a contemporary atomist approach which conceives of the atom not as the basic common denominator that constitutes all things in their material terms of extension in space, time and action (matter and energy), but as the basic common denominator that factors in, symbolically (information), in the material constitution of all things. In short: the atom as that which can only be thought.
We take the digital to mean, in principle as well as in effect, (1) the primacy of relations over the functives (terms, relata) thereby related, and (2) an abundance of virtual relations over actual relations. Anything could be related to anything else. How are we to treat units, elements, proportions, ratios, similarities; how to gain orientation and stability in such climatic and environmental circumstances? How might we conceive of a digital architectonics?
Let’s follow the often forgotten idea of a general science of measuring, called posology, and relate this idea to that of a general science of “naked” or “pure” quantities, a mathesis universalis. Posology was perhaps first mentioned by Aristotle, and has lately been picked up in an interesting way by Gilles Deleuze (in his lecture on the Method of Dramatization). Instead of asking what-questions, in the sense of essence, posology asks what-quantity-questions first, before characterizing and classifying substances. Broadly speaking, posology is concerned with the question of dosage. Any question of dosage, unlike one of measurement, is oriented within a variety of scales which can be fully determined only in principle, never actually, exhaustively, positively. Dosages are tested and fixed between maxima and minima – symbolic markings of virtual extremes – rather than on the basis of elementary givens. Nevertheless, such testing needs to be fully determinable in principle, and this is where the fundamental theorem of algebra becomes interesting: for every problem that can be stated in its terms there are computable solutions. Not only one solution, and not one absolutely optimal solution either, but whole ranges of possible ones. The challenge now shifts towards the articulation, the formulation and the formalization of what we regard as problematic. Such modelling, articulating, formulating, formalizing, computing and simulating of differentiated perspectives on virtually possible solutions is what we call pre-specific modelling.
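The shift from one optimal answer to whole ranges of admissible ones can be toyed with in a few lines. The response curve below is entirely made up for illustration; the point is only that a dosage is fixed between symbolic extremes rather than solved for as a single value:

```python
# A hypothetical response curve: dose * (10 - dose), peaking at dose = 5.
def effect(dose):
    return dose * (10 - dose)

# Fix the dosage between a symbolic minimum (16) and maximum (25):
# every dose whose effect stays inside that band is admissible.
admissible = [d / 10 for d in range(0, 101) if 16 <= effect(d / 10) <= 25]
print(min(admissible), "to", max(admissible))  # → 2.0 to 8.0
```

Not one answer but a whole interval of them; which dose to pick remains a question of articulation, not of computation alone.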
Like any modelling, pre-specific modelling too depends on partition and analysis, or, to link it more closely to our framework of a generalized posology: every dosage needs a unit according to which it can be measured, described, and passed on. Units are derived by setting some defined magnitude as elementary. Within architecture, the paradigmatic example would be the so-called column module in the art of Greek temple building, from which a ratio is derived and declined across scales to put the whole building into proportion. Today we work with computers, where the elementary units are bits. Bits are the kind of units that render information into a technologically handleable quantity. Usually they are treated as code-based storage devices, of no theoretical interest in themselves, arbitrarily depending on different standards, such as the 16- or 32-bit standards. The difference between these standards lies mainly in their capacity to store large numbers.
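That last point can be made concrete: the width of the unit fixes nothing but how many distinct values one unit can hold.

```python
# The "standard" is simply a choice of unit width: a 16-bit unit
# distinguishes 65,536 values, a 32-bit unit over four billion.
for width in (16, 32):
    distinct = 2 ** width
    print(f"{width}-bit: {distinct:,} distinct values "
          f"(unsigned maximum {distinct - 1:,})")
```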
And yet, bits are, literally speaking, a very peculiar thing – looked at within the philosophical language game of quantities, they are of a strange kind indeed. Before the rise of symbolic algebra and set theory, every quantity had been treated, for more than two millennia, under the double aspect of magnitude (how much? how long? etc.) and multitude (how many?). Each of those aspects depends on the determination of units, either for measuring or for counting.
Now with information technology, we have bits as finite formal units (that is why they can be treated technically: they are formal) – yet they are units of indefinite determinability. Indefinite is not the same as infinite; this distinction is quite important. While the infinite would fit well within the elemental thinking of basics, primes, roots and kernels, the indefinite fits much better within a posological thinking of dosages. Hence we will call our digital bits Any-Units. Perhaps they can be conceived as Intensive Quantities. Perhaps – these are tentative ideas for now. In any case, they quite severely upset our notions of measuring and proportionality. In actual built or designed architecture, this makes for funny shapes and – as Étienne-Louis Boullée, the French Enlightenment architect, might have termed buildings like those of Frank Gehry or Zaha Hadid, not without contempt – colossal volumes of gigantic, titanic and primary order. In less emphatic and more placid and general terms, they indicate at least a changed principle of availability through logistics. Digital logistics, as they make quite clear, comprehends much more than the infrastructures for the physical mobility of transporting products, people, and structures. What used to be a model of something is increasingly applied as a model for something.
For a digital architectonics it still remains to be conceptualized how exactly the quantity-constitutive units of today, the informational bits, can be thought to continue the incomparably rich and differentiated genealogy spanning from Euclid’s elementary geometry via Cartesian analytic geometry, the 17th-century approaches to specious algebra and geometry, their concentration and radicalization in the Leibnizian idea of a characteristica universalis and an analysis situs, to Boole’s algebraization of classical syllogisms and their topoi, and on to Turing’s mechanization of arithmetic in general. Surely, computed forms and structures that could not, by non-digital means, be drawn or calculated ought to be placed within this tradition.
What we call pre-specific modelling focuses on problems resolvable only from a stance within the Universability of Information. We thereby regard information not as quantum, nor as quantity, but as quantitability. While fully determinable, information is, constitutively so, indefinite. To make one step towards a digital proportionality we could say: information as quantitability is abstract but real, just like the space of the real numbers is abstract but real. And as little as real numbers can be understood to represent natural numbers or integers, as little can information be understood as representing semantic denotation. Like the real numbers, information can be limited (or defined) by active determination only. In the case of information, such determination is instantiated by actively maintained integration and differentiation of what the any-bits are supposed to articulate and organize. For just as we see them belonging to the philosophical language game of quantity, they surely belong to that of notation, script and media. Here, however, the any-bits can actually be conceived as the Leibnizian universal “characters” – with the significant difference, however, that universality for the algebraist today can no longer mean a totality, an absolute term. We have long started to inhabit universes of discourse, in the plural.
Pataphysics | continuing the idea of an experimental and empirical science of imaginary solutions, whose feasibility is rooted in algebraic symbols and code
Digital architectonics, hence, will focus on what aspects these considerations open up for better understanding the largely technical 20th-century notion of information in its genuinely symbolic, algebraic make-up. How does the bit, as any-unit, belong in the genealogy of quantities, magnitudes, numbers, proportions, ratios? How does it fit within all that we know about treating known and unknown quantities, and variable quantities, i.e. any determinable indefinite quantity such as geometric summations or quadratures (integrals)? And how does it fit, more radically and recently, since the rise of Dedekind’s conceptual number theory, with the determinable indefinite quantities of an algebraic kind, namely those determinable over the fields of numeric ideals instead of the natural numbers or integers, when treated as terms within polynomial equation systems which always articulate wholeness by being genuinely dispars?
Perhaps a little example: say two shipwrecked sailors are stranded on a remote island in the middle of the ocean, and there they encounter a fruit neither of them has ever seen or tasted before, but which is delicious, nutritious and in plentiful supply. They will soon make up a word for it. The wongle fruit, as they might call it – because it reminds them of a wongle, which they will have a good laugh about, since only they know what a wongle is – will soon acquire its own connotations and associations, and it will become part of their culture. Wongle juice, sea bass grilled and served with wongle, dried wongle and wongle powder will not be far off. And even if the two stranded sailors do not speak the same language and do not come from the same background, they are still capable of naming their fruit wongle, or anything else they like the sound of, and the word may still become part of their little island culture. Until, that is, they are rescued by a passing vessel, take some wongle fruit with them and start an immensely successful wongle business, soon exporting wongle to every corner of the world, patenting their wongle soft drink and turning Wongle into a trademark and global brand. Because of its great health benefits, their ethical farming practices and a clever, sustained marketing campaign, Wongle may soon stand for something that is universally good and beneficial, and the word may become part of people’s vocabulary around the world. They may start using it to express something wholly to be endorsed, and say things like “he’s a wongle of a man”, or “we’ve had a wongle holiday”… It may even turn into a verb: “I don’t know how she did it, but she’s certainly wongled it!” The humble wongle has become well and truly symbolised.
So, theoretical questions for pre-specific modelling involve, among others, questions like: what are we actually analyzing when looking for classifications in datasets? What orients our expectations when extracting the so-called Proportional Means for clustering data? How can we make sense of vectors, feature vectors, but especially also eigenvectors, eigenvectors-conceived-within-populations? For each eigenvector encapsulates – to put it somewhat drastically – the entire Universe of Discourse that is logically (inferentially) possible on what it indexes. Do we not encounter here, with this capacity for demarcating potential coherency, an oxymoron of a concept, a kind of “quantitative concept” which has not yet been articulated and rendered into an actual, intuitive form and meaning? And what, more profoundly still, orients our ideas thereof? And how can we learn to think about orientation as an ability, as the intellectual ability to maintain social, cultural, economic stability, self-composure and placidity?
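To give the talk of feature vectors and eigenvectors a modest computational anchor: a sketch, assuming Python with NumPy and an invented toy dataset, that extracts the eigenvectors of a data cloud’s covariance matrix – the directions along which a population of feature vectors coheres.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "population" of 200 two-dimensional feature vectors,
# deliberately stretched along the x-axis (invented for illustration).
data = rng.normal(size=(200, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

# Eigenvectors of the covariance matrix index the directions along
# which the population varies coherently.
cov = np.cov(data, rowvar=False)
eigenvalues, eigenvectors = np.linalg.eigh(cov)

# eigh returns eigenvalues in ascending order; the last eigenvector
# spans the dominant direction of the cloud.
dominant = eigenvectors[:, -1]
assert eigenvalues[-1] > eigenvalues[0]
assert abs(abs(dominant[0]) - 1.0) < 0.1  # nearly aligned with the x-axis
```

The eigenvector itself is just a direction; what it demarcates is the axis relative to which every member of the population can be compared – a quantitative index of potential coherency, in the sense gestured at above.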
There are two assumptions we hold crucial for such a perspective. The first is that we can become friends with the electron. And the second is that electricity is an intellectual abstraction of energy. We are living on and are part of a genius planet.
materiability research network
Seminar | materialism without territory. art and the environment.
The Chair for Computer Aided Architectural Design at ETH Zurich organizes the two-day seminar entitled Materialism Without Territory: Art and the Environment, which will take place on Friday May 30th and Saturday May 31st, 2014, at Cabaret Voltaire, Zurich.
The event will bring together international lecturers from the fields of architecture and philosophy to discuss issues of theory, practice and the literacy of computability, in a format intended to give plenty of room for discussion.
Audience space is limited, but the event is open to anyone interested; please refer to the event’s website for registration: http://materialismwithoutterritory.wordpress.com
FOCUS OF THE SEMINAR
The ancients distinguished contemplation (theory), as learning about that which never changes and which no human knowledge can influence, from the acquisition of skills (practice), as a kind of learning about that which constantly changes and which human knowledge can influence. Practice was hence minorized as dealing only with the probable, while it was reserved for theory alone to address questions of truth. Against this legacy, much of modernity’s spirit of progressivism is driven by refuting the ancient minorization of practice in the light of theory, indeed by inverting it in strict favour of how knowledge can be made useful through systematic implementation: no mathematician shall any longer pursue the classical mathematical problems (squaring the circle, doubling the cube), the French Academy famously proclaimed when it resolved to stop examining purported solutions to them.
The more recent emergence of two fields, both equally real and virtual, suggests that we take a fresh look at this legacy: (1) ecology and the environment, as a way to frame how we refer to „nature“, and (2) computability, as a way to operate intellectually on the templates of how we frame „nature“ (implicitly or explicitly). Both ecological thinking and the literacy of computability force us to think about systems in a way that can make no reference to a “natural state” for these systems – at least not without making metaphysical or ideological imports.
Thereby, there seems to be a wide consensus today that practice stands on the side of materialism, granting an undogmatic discourse because it centralizes quantitative aspects (magnitudes, and how they are calculated by computing multitudes) as opposed to formal ones (territories, and how they are governed by measuring). Theory, on the other hand, seems to stand on the side of idealism, prioritizing aspects of form. As we well know, each of these intellectual poles claims to be prime in how we ought to speak about reality.
Hence the interest of this two-day seminar: What is at stake in this competition between form and quantity, territory and magnitude? And how does the one notion so central to computing, namely that of information, fit within this context? Or, in more general terms: How does the new literacy of computability fit within the old scheme that distinguishes theory from practice? Does this scheme remain relevant and informative in any way in our contemporary cultural landscape? In which variant does it remain relevant (or, if declared irrelevant, into what has it dissolved)?
The two-day seminar is organized by the Chair for Computer Aided Architectural Design at ETH Zürich, by Dr. phil. Vera Bühlmann and Jorge Orozco. The event is tailored for the Chair’s PhD candidates as well as MAS students, and the format is chosen such that there will be plenty of room for discussion (a 45-minute paper and 75 minutes of discussion per session). A reader will be provided by the PhD candidates, with a selection of texts they would like to see discussed in relation to the papers given.
PROGRAM
Friday May 30th 2014
10 am – 11 am
welcome and introduction by Vera Bühlmann, ETH Zurich, Switzerland
and
Jorge Orozco – Architecture and Indexes
ETH Zurich, Switzerland
11 am – 1 pm
Ludger Hovestadt – The Composition of Zero. Why the mathematics of Leon Battista Alberti is not about data
ETH Zurich, Switzerland
2.30 pm – 4.30 pm
Gregg Lambert – How to create a territory with a work of art?
Syracuse University, USA
5 pm – 7 pm
Neil Leach – Adaptation
University of Southern California, USA
Saturday May 31st 2014
10 am – 12 noon
Vera Bühlmann – The body of the cypher (an atomist view on computational entities that are generic in their kind)
ETH Zurich, Switzerland
2 pm – 4 pm
Iris van der Tuin – Pushing ‘generation’ to an extreme, and how genealogy is a geophilosophy
Utrecht University, Netherlands
4.30 pm – 6.30 pm
Sjoerd van Tuinen – The Cosmic Artisan
University of Rotterdam, Netherlands