How does an idea pass through a tool to become a project?
Thinking is the most complex cognitive process: it reflects the essential properties, features and regularities of reality. Through it we build models of the world and represent it according to particular objectives, intentions and desires. The main forms of thinking are observation, analysis and synthesis (rational discourse).
In each of these three forms, knowledge also passes through different kinds of consciousness or media, and with every step it is abstracted further and therefore acquires a different kind of meaning.
In the paper Entering a Risky Territory: Space in the Age of Digital Navigation, Bruno Latour1 analyses the evolution of mapping and its radical transformation in the digital age. The reasons for depiction, the means of gathering data and the forms of representation have developed in parallel with our society. Typically, following a scientific discovery, the newly acquired knowledge shifts the general understanding of our place in the world; artists are quick to grasp these new notions and interpret them subjectively, and later the change is reflected in everyday life as well. Where Latour focuses on the shift between the pre-digital and the post-digital age and the different notions each produces, the text The New Plasticity by Sanford Kwinter2 from 1986 follows much more closely the transition of knowledge from one field to another. The process starts with the publication of the theory of relativity and the revolution it caused in the understanding of time and space, continues with its interpretation in the arts, described through the paintings of Boccioni and the Futurists, and ends in architecture with Antonio Sant'Elia's visions and his project for La Città Nuova. In both texts, as in many other interpretations, it is easy to follow the tendency of knowledge to morph as it passes through different media. As Marshall McLuhan famously concluded, the medium is the message. This conclusion, however, leads to a question: now that we are equipped with such powerful tools, tools which participate in creation just as actively as we do, how does that change our designs and our understanding of our environment?
For more than two millennia it was Aristotelian logic that prevailed in philosophy: essentially the logic of non-contradiction (no statement can be both true and false). Little changed until the seventeenth century, when Leibniz3 proposed a symbolic logic that would reduce reasoning to a kind of calculation, and then the nineteenth, when George Boole4 formulated the algebra now called Boolean logic, which changed everything and initiated the chain of discoveries leading to the birth of modern computing.
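Boole's move can be sketched in a few lines: truth values become the numbers 0 and 1, and the logical connectives become arithmetic on them. This is a toy illustration of the idea, not Boole's original notation:

```python
# Reasoning reduced to calculation: truth values as 0 and 1,
# connectives as arithmetic on them.

def AND(a, b): return a * b          # 1 only when both inputs are 1
def OR(a, b):  return a + b - a * b  # 1 when at least one input is 1
def NOT(a):    return 1 - a          # inverts the value

# the law of non-contradiction, now a computable fact:
# a statement AND its negation always evaluates to 0
for a in (0, 1):
    print(a, AND(a, NOT(a)))  # second column is always 0
```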
In the 1950s and 1960s, researchers predicted that once human knowledge could be expressed in logic with mathematical notation, it would be possible to create a machine that reasons: an artificial intelligence. In this way a thought is put into a different frame and is then itself computed. An idea starts in one system (our brain) and develops in another (our tool), which takes its own decisions based on the instructions we have provided; the product is the result of the interplay of these two systems. It is exactly at the stage where a decision made by one medium (the human) is communicated to another (the computer) that the key characteristic of the transformation is formed. Just as in logic, for a designer to communicate an idea to a computer, the logical statement has to be expressed mathematically and proposed as a series of algorithmic actions.
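As a minimal sketch of that translation, consider a hypothetical design rule, say "a room is habitable if it has a window and at least 9 m² of floor area", restated as a Boolean predicate the machine can evaluate. The rule, the rooms and all names here are invented for illustration:

```python
# A designer's intention restated as a predicate a computer can decide.

def habitable(has_window: bool, area_m2: float) -> bool:
    # the logical statement, expressed mathematically
    return has_window and area_m2 >= 9.0

# hypothetical rooms, fed to the machine for judgement
rooms = [
    {"name": "study",  "has_window": True,  "area_m2": 11.5},
    {"name": "closet", "has_window": False, "area_m2": 4.0},
]

for room in rooms:
    print(room["name"], habitable(room["has_window"], room["area_m2"]))
```

The decision is no longer made by the designer directly but by the tool, within the definition the designer supplied.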
A comparison we can already witness is instructive: the idea of self-organizing processes has inspired architects and urban planners for centuries. Just as the study of logic suggests, the notion of knowledge, and of what is knowable, proceeds in three steps: observation (self-organizing processes are most notable in nature), analysis (the systematic study of natural precedents) and synthesis (artificial simulation). In architecture this began as pure imitation of form; gradually the strategy changed, and the ideal became to imitate the process which formed the shape in the first place. The attitude changed accordingly: where Frei Otto would build an analogue mock-up of the geometry and then construct an exact replica of what his studies concluded, designers today go through a series of iterations before reaching the final product, as in the Taichung Metropolitan Opera House by Toyo Ito, based on a minimal-surface logic. Where once all form, structure and space were subordinate to the result of the study (the structure of the Munich Olympic Stadium by Frei Otto follows precisely the lines of the hanging-chain model, and the forms and spaces of the Multihalle Mannheim are an exact replica of what resulted from the cloth model), the newer projects are the product of series of iterations, with spaces designed according to local needs; on the other hand the algorithm must stay true to itself, so iterations are possible only within its definition.

Another meaningful difference lies in the stage at which the project jumps from one medium to another. Where Frei Otto's projects remained virtual only as ideas, were then transferred into drawings and very quickly brought into the real world as mock-up models, modern projects go through several rounds of moderation while still in the virtual realm.
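The shift from imitating the form to imitating the form-finding process can itself be sketched in a few lines: instead of drawing a curve, we let a chain "hang" by iterative relaxation until it settles. Node count, gravity value and iteration count are illustrative assumptions, not Frei Otto's actual method:

```python
# Form-finding by process rather than by drawing: a hanging chain
# emerges from repeated local adjustments, not from a drawn curve.

N = 21            # chain nodes; both endpoints fixed at height 0
gravity = 0.05    # downward sag applied to each free node per step
y = [0.0] * N

for _ in range(2000):           # relax until the shape settles
    new_y = y[:]
    for i in range(1, N - 1):   # interior nodes only
        # each node moves toward the average of its neighbours,
        # then sags a little under gravity
        new_y[i] = (y[i - 1] + y[i + 1]) / 2.0 - gravity
    y = new_y

print(round(min(y), 3))  # lowest point at mid-span; -5.0 for these parameters
```

Changing `gravity` or `N` and re-running is the digital analogue of re-hanging the physical chain: the iterations vary, but only within the definition of the algorithm.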
Rapid prototyping now seeks to enhance the direct link between computer and physical reality, so the process of stepping from one medium to another happens directly, but scale remains a challenge.
Since the tool becomes a separate, independent system (an artificial intelligence), ruled by algorithms and conditionals, the product is shaped by both our intentions and the tool's active decisions. That, of course, changes the process itself once more: one has to go through many more stages of iteration and experiment. Speculation becomes a large part of the process, designing by manipulating, a far more curious and intuitive way of working. Because conceptualizing, formalizing and prototyping happen much faster, our ideas are brought to light sooner and their evolution is accelerated. Learning by doing gains even more weight, and customization brings even more dynamism to our lives. On the other hand, the rate at which physical reality develops can never catch up with developments in virtual reality: there can be no prototyping of urbanism. The old problem of architecture, that it must predict the needs of society because building is slow, therefore becomes even more severe now that virtual development, and with it social development, happens so much faster than building in the real world. Taking the abstract path of the computer quickly leads us into areas untouchable by the physical world, and vice versa. The virtual and the real can never meet on the same schedule, although they can, and should, supplement each other in the process.
1 Valerie November, Eduardo Camacho-Huebner, Bruno Latour, Entering a Risky Territory: Space in the Age of Digital Navigation
2 Sanford Kwinter, La Città Nuova: Modernity and Continuity, Zone 1/2 (New York: Urzone, 1986)
3 Gottfried Wilhelm von Leibniz (1646–1716) – German mathematician and philosopher, known for his Law of Continuity and Transcendental Law of Homogeneity, and for refining the binary number system.
4 George Boole (1815–1864) – English mathematician and logician; his The Laws of Thought (1854) developed the algebra of logic now known as Boolean algebra.