Below are all the articles published on the site, in chronological order.
Radiocarbon dating, developed nearly sixty years ago, revolutionized archaeology (and not only archaeology) by making it possible to establish the age of finds of organic origin, such as bone, wood, paper, textiles, and so on. Now a team of Italian researchers at the INO-CNR in Florence has developed a new technique that is ten times cheaper than traditional methods and a hundred times smaller.
Mass spectrometers, generally used to measure the residual amount of carbon-14, are rather imposing and expensive instruments. To analyze a find accurately, one must therefore turn to large nuclear physics laboratories for help. The new instrumentation, based on infrared laser light, is not only cheaper and more convenient: it directly measures the number of molecules containing radiocarbon atoms, greatly easing the work of archaeologists, physicians, and environmental technicians.
Living organisms absorb carbon, including the radioactive isotope carbon-14, throughout their lives. From the moment of death onward, the amount of the radioactive isotope decreases steadily. Measuring it therefore gives a fairly precise age for finds containing material of biological origin.
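The dating principle rests on the standard exponential decay law: with the commonly used half-life of about 5,730 years for carbon-14, the age follows directly from the fraction of the isotope still present. A small sketch, purely illustrative and unrelated to the instrument's own software:

```python
import math

C14_HALF_LIFE_YEARS = 5730.0  # commonly cited half-life of carbon-14

def radiocarbon_age(remaining_fraction: float) -> float:
    """Age in years from the fraction of C-14 still present (N/N0)."""
    if not 0 < remaining_fraction <= 1:
        raise ValueError("fraction must be in (0, 1]")
    decay_constant = math.log(2) / C14_HALF_LIFE_YEARS
    return math.log(1 / remaining_fraction) / decay_constant

# A sample with half its original C-14 is one half-life old.
print(round(radiocarbon_age(0.5)))   # 5730
print(round(radiocarbon_age(0.25)))  # 11460
```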
With the classical analysis, however, a small piece of the find must be burned to obtain carbon dioxide, from which the carbon is then extracted atom by atom. Traditional mass spectrometers must be extremely sensitive, because only one molecule in a trillion contains radiocarbon instead of ordinary carbon.
This is a problem the Italian researchers have partly solved: "The new methodology is based on an ultra-high-sensitivity spectroscopic technique called SCAR (saturated-absorption cavity ring-down)," explains Davide Mazzotti, co-author of the study.
The solution is to use infrared laser light, invisible to the human eye but absorbed by the molecules. "The infrared radiation is reflected between two mirrors enclosing the gas to be analyzed. In this way the light passes thousands of times through the same carbon dioxide molecules being measured, which is equivalent to multiplying the number of available molecules by thousands and thus increasing the 'sensitivity' of the measurement," adds first author Iacopo Galli.
Archaeology will not be the only field to benefit from the new method: potential applications range from monitoring climate change and pollution to medical research and the detection of toxic or dangerous substances, by way of airport security and tests of fundamental physics.
Microorganisms of the myxogastrid group (slime molds), single-celled life forms that live on decomposing plant matter and feed on bacteria, have proved capable of finding the shortest path through a maze to a food source. The phenomenon, studied by scientists at several research institutions in Japan, could form the basis for designing bio-computers capable of solving complex problems.
Grown on Petri dishes, the myxogastrids form colonies and organize their cells so as to create the shortest route to the food source, which is made relatively hard to reach by placing it on the culture medium inside a maze.
The scientists who carried out the experiment believe that these microorganisms have a certain capacity to process information, which allows them to optimize their route, making the colony grow in the most advantageous direction so as to reach the food source as quickly as possible.
Despite lacking a central nervous system, their ability to analyze the possibilities and optimize the route proves superior to that of a computer, the researchers say. For a computer, the volume of calculations required could be too large, whereas a myxogastrid such as Physarum polycephalum (whose colonies look like a kind of yellowish paste) can find the best route more efficiently.
As a result, these life forms could be used to design optimal routes in the field of transport.
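In computing terms, what the colony does is a shortest-path search. A minimal breadth-first-search sketch over a grid maze (an illustrative analogy only, not the researchers' model of the organism) finds the same kind of route:

```python
from collections import deque

def shortest_path_length(maze, start, food):
    """Breadth-first search on a grid maze: '#' is a wall, '.' is open.
    Returns the number of steps on the shortest route, or None."""
    rows, cols = len(maze), len(maze[0])
    queue = deque([(start, 0)])
    seen = {start}
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == food:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols \
                    and maze[nr][nc] == "." and (nr, nc) not in seen:
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return None  # food unreachable

maze = ["....#",
        ".##.#",
        ".#...",
        ".#.#.",
        "...#."]
print(shortest_path_length(maze, (0, 0), (4, 4)))  # 8
```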
Another field in which they could be used is neurology: some scientists are trying to understand the complex mechanisms of the human brain by studying how such microorganisms process information and how they form networks.
Source: AFP - via Descopera.ro
Simulating clouds, those fluffy puffs of water vapor, is so computationally complex that even today's most powerful supercomputers, performing quadrillions of calculations per second, cannot model them accurately.
"Clouds modulate the climate. They reflect some sunlight back into space, which cools the Earth; but they can also act as a blanket and trap heat," says Michael Wehner, a climate scientist at the Lawrence Berkeley National Laboratory (Berkeley Lab). "Getting their effect on the climate system correct is critical to increasing confidence in projections of future climate change."
In order to build the breakthrough supercomputers scientists like Wehner need, researchers are looking to the world of consumer electronics like microwave ovens, cameras and cellphones, where everything from chips to batteries to software is optimized for the device's application. This co-design approach brings scientists and computer engineers into the supercomputer design process, so that systems are purpose-built for a scientific application, such as climate modeling, from the bottom up.
"Co-design allows us to design computers to answer specific questions, rather than limit our questions by available machines," says Wehner.
Co-design Test Case: Clouds
In a paper entitled "Hardware/Software Co-design of Global Cloud System Resolving Models," recently published in Advances in Modeling Earth Systems, Shalf, Wehner and coauthors argue that the scientific supercomputing community should take a cue from consumer electronics like smart phones and microwave ovens: Start with an application—like a climate model—and use that as a metric for successful hardware and software design.
The paper, which uses the climate community's global cloud resolving models (GCRMs) as a case study, argues that an aggressive co-design approach to scientific computing could increase code efficiency and enable chip designers to optimize the trade-offs between energy efficiency, cost and application performance.
According to coauthor David Donofrio, a co-designed system for modeling climate would contain about 20 million cores (today's most powerful scientific cluster, Japan's 'K Computer', contains about 705,000 cores) and be capable of modeling climate 1,000 times faster than is currently possible.
"Most importantly, the system would remain fully programmable so that scientific codes with similar hardware needs to the GCRMs, like seismic exploration, could also benefit from this machine," says Donofrio, a computer scientist at Berkeley Lab.
"Today when we purchase a general purpose supercomputer, it comes with a lot of operating system functions that science applications don't need. When you are worried about power, these codes can be very costly," says Shalf. "Instead of repurposing a chip designed for another market, the scientific HPC (high performance computing) community should specify what they want on a chip—the intellectual property (IP)—and only buy that."
According to Shalf, a co-designed system for modeling climate would use about one quarter to one tenth the energy required for a conventional supercomputer with the same capabilities.
Consumers Pave the Way for Next Generation Supercomputers
Although innovative for scientific supercomputing, the idea of application-driven design is not new. Electronics like cell-phones and toaster ovens are built of simpler embedded processor cores optimized for one or a few dedicated functions.
"Because the ultimate goal of the embedded market is to maximize battery life, these technologies have always been driven by maximizing performance-per-watt and minimizing cost. Application-driven design is the key to accomplishing this," says Shalf. "Today we look at the motherboard as a canvas for building a supercomputer, but in the embedded market the canvas is the chip."
He notes that the most expensive part of developing a computer chip is designing and validating all of the IP blocks that are placed on the chip. These IP blocks serve different functions, and in the embedded market vendors profit by licensing them out to various product makers. With an application in mind, manufacturers purchase IP block licenses and then work with a system integrator to assemble the different pieces on a chip.
"You can think of these IP blocks as Legos or components of a home entertainment system," says Donofrio. "Each block has a purpose, you can buy them separately, and connect them to achieve a desired result, like surround sound in your living room."
"The expensive part is designing and verifying the IP blocks, and not the cost of the chip. These IP blocks are commodities because the development costs are amortized across the many different licenses for different applications," says Shalf. "Just as the consumer electronics chip designers choose a set of processor characteristics appropriate to the device at hand, HPC designers should also be able to choose processor characteristics appropriate to a specific application or set of applications, like the climate community's global cloud resolving model."
He notes that the resulting machine, while remaining fully programmable, would achieve maximum performance on the targeted set of applications, which were used as the benchmarks in the co-design process. In this sense, Shalf notes that the co-designed machine is less general purpose than the typical supercomputer of today, but much of what is included in modern supercomputers is of little use to scientific computing anyway and so it just wastes power.
"Before this work, if someone asked me when the climate community would be able to compute kilometer scale climate simulations, I would have answered 'not in my lifetime,'" says Wehner. "Now, with this roadmap I think we could be resolving cloud systems within the next decade."
Although climate was the focus of this paper, Shalf notes that future co-design studies will explore whether this will also be cost-effective for other compute intensive sciences such as combustion research.
Source: PhysOrg - via ZeitNews
A New Form of Architecture
In 1948, the geodesic dome was far from the amazingly sophisticated structure it would become only a few years later. In fact, it consisted primarily of Bucky's idea and an enormous pile of calculations he had formulated.
Although Fuller was developing and studying the geodesic dome using small models, he was eager to expand his understanding through the construction of larger, more-practical projects. Thus, when he was invited to participate in the summer institute of the somewhat notorious Black Mountain College in the remote hills of North Carolina near Asheville, Fuller eagerly accepted. He had lectured at that rather unorthodox institution the previous year and had been so popular that he was asked back for the entire summer of 1948.
When he was not delivering lengthy thinking-out-loud lectures that summer, Fuller's primary concern was furthering an entirely new form of architecture. In his examination of traditional construction, he had discovered that most buildings focused on right-angle, squared configurations.
He understood that early human beings had developed that mode of construction without much thought by simply piling stone upon stone. Such a simplistic system was acceptable for small structures, but when architects continued mindlessly utilizing that same technique for large buildings, major problems arose. The primary issue created by merely stacking materials higher and higher is that taller walls require thicker and thicker base sections to support their upper sections. Some designers attempted to circumvent that issue by using external buttressing, which kept walls from crumbling under the weight of upper levels, but even buttressing limited the size.
Fuller found that the compression force (i.e., pushing down) that caused such failure in heavy walls was always balanced by an equal amount of tensional force (i.e., pulling, which in buildings is seen in the natural tendency of walls to arc outward) in the structure. In fact, he discovered that if tension and compression are not perfectly balanced in a structure, the building will collapse. He also found that builders were not employing the tensional forces available. Those forces are, instead, channeled into the ground, where solid foundations hold the compressional members, be they stones or steel beams, from being thrust outward by tension. Always seeking maximum efficiency, Fuller attempted to employ tensional forces in his new construction idea. The result was geodesic structures.
Because Bucky could not afford even the crude mechanical multiplier machines available during the late 1940s and was working with nothing but an adding machine, his first major dome required two years of calculations. With the help of a young assistant, Donald Richter, Fuller was, however, able to complete those calculations. Thus, he brought most of the material needed to construct the first geodesic dome to Black Mountain in the summer of 1948.
A Dymaxion House at The Henry Ford.
Disappointment before Success
His vision was of a 50-foot-diameter framework fabricated from lightweight aluminum, and, working with an austere budget, he had purchased a load of aluminum-alloy venetian-blind strips that he packed into the car for the trip to the college. Over the course of that summer, Bucky also procured other materials locally, but he was not completely satisfied with the dome's constituent elements, which were neither custom-designed for the project nor of the best materials. Still, with the help of his students, the revolutionary new dome was prepared for what was supposed to be a quick assembly in early September, just as the summer session was coming to an end.
The big day was dampened by a pouring rain. Nonetheless, Bucky and his team of assistants scurried around the field that had been chosen as the site of the event, preparing the sections of their dome for final assembly, while faculty and students stood under umbrellas, watching in anticipation from a nearby hillside. When the critical moment arrived, the final bolts were fastened and tension was applied to the structure, causing it to transform from a flat pile of components into the world's first large geodesic sphere. The spectators cheered, but their excitement lasted only an instant as the fragile dome almost immediately sagged in upon itself and collapsed, ending the project.
Although he must have been disappointed that day, Bucky's stoic New England character kept him from publicly acknowledging such emotion. Instead, he maintained that he had deliberately designed an extremely weak structure in order to determine the critical point at which it would collapse and that he had learned a great deal from the experiment. Certainly, the lessons learned from that episode were valuable, and his somewhat egocentric rationale was by no means a blatant lie. However, had he really been attempting to find the point of destruction, Bucky would have proceeded, as he did in later years, to add weights to the completed framework until it broke down.
In his haste to test his calculations, Fuller had proceeded without the finances necessary to acquire the best materials. Because of the use of substandard components, the dome was doomed to failure, and a demonstration of the geodesic dome's practical strength was condemned to wait another year.
During that year, Fuller taught at the Chicago Institute of Design. He and his Institute students also devoted a great deal of time to developing his new concepts. It was with the assistance of those design students that Fuller built a number of more successful dome models, each of which was more structurally sound than the previous one.
Then, when he was invited to return to Black Mountain College the following summer as dean of the Summer Institute, Fuller suggested that some of his best Chicago Institute students and their faculty accompany him, so that they could demonstrate the true potential of geodesic domes.
Having earned some substantial lecture fees during the previous year, Bucky was able to purchase the best of materials for his new Black Mountain dome. The project was a 14-foot-diameter hemisphere constructed of the finest aluminum aircraft tubing and covered with a vinyl-plastic skin. Completely erected within days after his arrival, that dome remained a stable fixture on the campus throughout the summer. To further prove the efficiency of the design to somewhat skeptical fellow instructors and students, Bucky and eight of his assistants daringly hung from the structure's framework, like children on a playground, immediately after its completion.
TO BE CONTINUED...
The Internet was created once, and it can be created a second time, even better: a Network parallel to the existing one, independent, and proof against censorship and surveillance. For now it is only an idea, not least because it does not look easy to implement. The project, however, already has a name: The Darknet Project, abbreviated to the acronym TDP.
According to Ars Technica, a group of cyber-activists met in a chat last week. For now the group coordinates via social networks and on Reddit. The goal appears to be precisely to create a global darknet (that is, a private network). Or rather, a set of darknets: a network of networks that operate independently of one another (and of the Internet) but are interconnected at certain nodes. Avoiding centralization is in fact a priority: that way, if a node or one of the webs were taken down, the system as a whole would stay up.
"Fundamentally, the goal of the Darknet Project is to create an alternative, freer Internet through a worldwide network of networks," stressed one of the TDP coordinators, who goes by the name Wolfeater. How? The first step will be to build local networks and connect them using existing wireless infrastructure, until new infrastructure is able to sustain the system.
The project was probably inspired by the so-called Operation Mesh, launched by the group Anonymous in response to the Anti-Counterfeiting Trade Agreement (ACTA), which called for support in creating just such a Web parallel to the existing one, perhaps using the software of the anonymous encrypted network I2P and the WRP protocol from the Batman site. At the moment, the people behind TDP do not appear to be a group able to coordinate at the same level as Anonymous in order to move the project forward concretely, notes Una. It cannot be ruled out, however, that someone listening on the Web, with the necessary knowledge and technical skills, will answer the call.
Besides, the idea has been circulating for a while. It had even occurred to the US government, among others - the New York Times reported last June - with the aim of supporting dissidents in countries under regimes of censorship and repression. Meanwhile, open-source software projects backed by non-profit organizations had already sensed the growing demand for technologies to set up darknets. One of these is Freenet, which aims to build darknets on top of existing Internet infrastructure. Another is Serval, which aims to create independent networks using smartphones: it has already developed prototype software for Android and is looking for volunteers to test it.
Strategies for reaching people in various activist groups: how to concentrate on what we all have in common, how to avoid alienating these groups when spreading your message, and the need for the various groups (Occupy, Tea Party, etc.) to join forces against common foes to effect real change.
Free TZM Global Show Archives:
On Feb 1st 2012 at 4pm EDT Neil (A.K.A VTV) and Aaron (A.K.A StormCloudsGathering) will host this week's TZM Global Radio Show.
About : Started in 2009, TZM Global Radio is a weekly radio show presented by various coordinators/lecturers of The Zeitgeist Movement in a rotational fashion.
My name is Neil Kiernan, a.k.a. VTV. V-RADIO is an activist radio show dedicated to spreading awareness of the need for mankind to focus on sustainability, equality, and peace. Topics frequently discussed on V-RADIO include: The Venus Project and the life's work of industrial designer and social engineer Jacque Fresco; The Zeitgeist Movement and the work of filmmaker Peter Joseph; Charlie Veitch and the Love Police; Ben Stewart of the Hanged Man Project; the "Occupy" movement, particularly Occupy Detroit; and interviews with many other filmmakers, scientists and activists. V-RADIO is an effort funded by donations from its listeners. If you would like to support V-RADIO, click the "Donate" tab at the top of this page.
If you are new to V-RADIO, you can check out the V-RADIO archives on one of the tabs above and get started. On the links page you will find links to my various social networking sites (Facebook, MySpace, etc.) and to other radio shows dedicated to spreading awareness of this direction. Click on the "Must see TV" link for links to many great documentaries, viewable for free on the internet, that give more information and insight into why we are doing what we are doing in the Zeitgeist movement. All of my shows are on iTunes as well. And if you are here for Zeitgeist TV, you can now view it and participate in the chat room right here on V-RADIO.org.
Thanks again to all of my supporters as you are the reason I am doing this.
"Together with the allies, whom the US is currently persuading to buy vessels equipped with the Aegis combat system, the total potential can at present be estimated at about 1,000 interceptors," said Rogozin, who is also the Russian president's special representative in talks with NATO.
He warned that this figure is currently approaching the limits set by the recently signed Russia-US agreement on the reduction of strategic arms.
Russian Deputy Prime Minister Dmitri Rogozin. (ALEXANDER NEMENOV / AFP / Getty Images)
"There are no guarantees that, after the completion of the first, second, and third phases (of the US missile shield - ed.), there will not be a fourth, fifth, and sixth. Do you really believe they will halt all their technologies after 2020? It makes no sense! They will press ahead with developing and improving the technical parameters of the interceptors and the capabilities of the warning systems," Rogozin added.
He stressed that the US interceptors cover the entire European part of Russia, as far as the Ural mountains, and can strike both the short- and medium-range missiles of countries such as Iran and North Korea and Russian intercontinental ballistic missiles.
"The fact that the missile defense system can strike strategic missiles, and the fact that bases and a fleet are deployed in the northern seas, demonstrates (...) the obviously anti-Russian nature of the American missile defense system," he said.
Source: epochtimes-romania.com - based on a NewsIn article
Although we have plenty of friends on Facebook, the number of people who are genuinely there for us when we need them is smaller and smaller, warns Matthew Brashears, a sociologist at Cornell University.
Brashears surveyed more than 2,000 adults and found that, although people socialized more than ever between 1985 and 2010, the number of true friends declined.
The research was based on a questionnaire in which participants were asked to list the names of the people to whom they had confided something important over the previous six months. The results showed that 48% of participants named a single person, 18% wrote two names, 29% mentioned more than two people, and just over 4% wrote no name at all.
On average, participants had 2.03 close friends, compared with the 3 reported in a similar study conducted in 1985.
Although we tend to feel that we get more social support from friends on social networks, we increasingly limit our circle of friends in the real world, the researcher says. Brashears argues that this phenomenon should not be seen as a negative one, since we do receive support and advice from virtual friends; we simply should not neglect real life and the importance of true friends.
The researcher cautions that there is a difference between online and in-person contact. Thus, even though people tend to meet more people thanks to Facebook and other social networks, the level of trust decreases, as indicated by the fact that the most intimate problems are shared with an ever smaller number of friends.
Source: ABC News & descopera.ro
Quantum cryptography is already used in Swiss elections to ensure that electronic vote data is securely transmitted to central locations. And as far as we know, no current quantum cryptographic system has been compromised in the field. This may be due to the work of security researchers who spend all their waking moments—and quite a lot of their non-waking moments—trying to pick the lock on quantum systems.
Their general approach can be summed up as follows: if you can fool a detector into thinking a classical light pulse is actually a quantum light pulse, then you might just be able to defeat a quantum cryptographic system. But even then the attack should fail, because quantum entangled states have statistics that cannot be achieved with classical light sources—by comparing statistics, you could unmask the deception. In the latest of a series of papers devoted to this topic, a group of researchers has now shown that the statistics can also be faked.
Quantum cryptography relies on the concept of entanglement. With entanglement, some statistical correlations are measured to be larger than those found in experiments based purely on classical physics. Cryptographic security works by using the correlations between entangled photon pairs to generate a common secret key. If an eavesdropper intercepts the quantum part of the signal, the statistics change, revealing the presence of an interloper.
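The "statistics" at issue here are Bell-type correlations. A toy numerical check (an illustration of the principle, not the experiment described in the paper) shows that no local deterministic assignment of outcomes can push the CHSH combination of correlations above 2, while entangled photons can reach 2√2:

```python
import itertools
import math

# CHSH combination of correlations: S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
# In a local deterministic model, each side pre-assigns an outcome (+1 or -1)
# to each of its two measurement settings; enumerate every such strategy.
best_classical = max(
    abs(a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1)
    for a0, a1, b0, b1 in itertools.product((-1, 1), repeat=4)
)

quantum_max = 2 * math.sqrt(2)  # Tsirelson's bound for entangled states

print(best_classical)         # 2
print(round(quantum_max, 3))  # 2.828
```

Comparing the measured value of S against the classical bound of 2 is exactly the kind of test that is supposed to unmask a classical forgery.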
But there's a catch here. I can make a classical signal that is perfectly correlated to any signal at all, provided I have time to measure said signal and replicate it appropriately. In other words, these statistical arguments only apply when there is no causal connection between the two measurements.
You might think that this makes intercepting the quantum goodness of a cryptographic system easy. But you would be wrong. When Eve intercepts the photons from the transmitting station run by Alice, she also destroys the photons. And even though she gets a result from her measurement, she cannot know the photons' full state. Thus, she cannot recreate, at the single photon level, a state that will ensure that Bob, at the receiving station, will observe identical measurements.
That is the theory anyway. But this is where the second loophole comes into play. We often assume that the detectors are actually detecting what we think they are detecting. In practice, there is no such thing as a single photon, single polarization detector. Instead, what we use is a filter that only allows a particular polarization of light to pass and an intensity detector to look for light. The filter doesn't care how many photons pass through, while the detector plays lots of games to try and be single photon sensitive when, ultimately, it is not.
It's this gap between theory and practice that allows a carefully manipulated classical light beam to fool a detector into reporting single photon clicks.
Since Eve has measured the polarization state of the photon, she knows what polarization state to set on her classical light pulse in order to trick Bob into recording the same measurement result. When Bob and Alice compare notes, they get the right answers and assume everything is on the up and up.
The researchers demonstrated that this attack succeeds with standard (but not commercial) quantum cryptography equipment under a range of different circumstances. In fact, they could make the setup outperform the quantum implementation for some particular settings.
The researchers also claim that this attack will be very difficult to detect, but I disagree. The attack depends on very carefully setting the power in the light beams so that only a single photodetector is triggered in Bob's apparatus. Within the detector, the light beam gets divided into two and then passed through polarization filters and detected. For a single photon beam, this doesn't matter—only one detector can click at any one time. But Eve's bright bunch of photons could set multiple detectors clicking at the same time. If you periodically remove filters, then Eve will inadvertently trigger more than a single photodiode, revealing her presence.
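That detection idea can be sketched as a toy model (the intensities and threshold below are invented purely for illustration; real detectors are far subtler): a genuine single photon lands entirely in one polarization arm and fires at most one detector, while Eve's bright classical pulse, split between both arms, can fire both at once.

```python
def detector_clicks(h_intensity, v_intensity, threshold=0.5):
    """Toy model: each arm's detector clicks when the light reaching it
    exceeds a threshold (numbers invented for illustration)."""
    return h_intensity > threshold, v_intensity > threshold

# A genuine single photon ends up entirely in one polarization arm,
# so at most one detector can click per pulse.
single_photon = detector_clicks(1.0, 0.0)

# Eve's bright classical pulse is divided between both arms by the
# polarizing beam splitter, so both detectors can click simultaneously.
bright_pulse = detector_clicks(5.0, 5.0)

def double_click(clicks):
    """A coincident click in both arms betrays a classical pulse."""
    return clicks[0] and clicks[1]

print(double_click(single_photon))  # False
print(double_click(bright_pulse))   # True
```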
Source: Physical Review Letters, 2011, DOI: 10.1103/PhysRevLett.107.170404 - Ars Technica - via ZeitNews.org
Citation: The Futurist, Nov-Dec 1989 v23 n6 p14(5)
Title: The birth of the geodesic dome; how Bucky did it. (R. Buckminster Fuller)
Authors: Sieden, Lloyd Steven
Subjects: Geodesic domes (research); Dwellings (innovations)
People: Fuller, R. Buckminster (innovations)
Reference #: A8121293
Richard Buckminster Fuller, c. 1917.
Born: July 12, 1895
Milton, Massachusetts, United States
Died: July 1, 1983 (aged 87)
Los Angeles, United States
Occupation: designer, author, inventor
Spouse: Anne Fuller
Children: 2 (Allegra Fuller Snyder; Alexandra, who died in childhood)
Full Text COPYRIGHT World Future Society 1989
The Birth of The Geodesic Dome
Although Buckminster Fuller invariably maintained that he was a comprehensivist who was interested in almost everything, his life and work were dominated by a single issue: shelter and housing. Even as a young boy in the early 1900s, Fuller--who preferred to be called Bucky--was constructing rudimentary structures and inventing better "environment controlling artifacts."
The practical culmination of his quest to employ modern assembly-line manufacturing techniques and the best man-made materials in producing inexpensive, elegant housing came toward the end of World War II. At that time, government officials contracted Fuller to build two prototype Dymaxion Houses at the Beech Aircraft Company in Wichita, Kansas.
The lightweight, circular houses were praised by all who toured them. Because the Dymaxion House was to provide many new innovations at the very affordable suggested retail price of $6,500, orders flowed into the factory before plans for distribution were seriously considered. However, Fuller's interests were not geared toward practical matters such as financing and marketing, and the Dymaxion House never advanced beyond the prototype stage. Fuller then moved on to consider other innovations that could benefit humanity in the areas of structure and housing.
He also returned to his less pragmatic quest to discover nature's coordinate system and employ that system in a structure that would, because it was based on natural rather than humanly developed principles, be extremely efficient. That structure is the geodesic dome, which, because it approximates a sphere, encloses much more space with far less material than conventional buildings.
In order to uncover nature's coordinate system, Fuller retreated from a great deal of his usual activities during 1947 and 1948. The primary focus of that retreat was a single topic: spherical geometry. He chose that area because he felt it would be most useful in further understanding the mathematics of engineering, in discovering nature's coordinate system, and eventually in building the spherical structures that he found to be the most efficient means of construction.
Having observed the problems inherent in conventional construction techniques (as opposed to the ease with which nature's structures are erected) and the indigenous strength of natural structures, Fuller felt certain that he could perfect an analogous, efficient, spherical-construction technique. He was also aware that any such method would have to be predicated upon spherical trigonometry. To do that, Bucky converted the small Long Island apartment that his wife, Anne, had rented into a combination workshop and classroom where he studied and discussed his ideas with others.
As those ideas started to take shape in the models and drawings he used for sharing his insights, Fuller considered names for his invention. He selected "geodesic dome" because the sections or arcs of great circles (i.e., the shortest distance between two points on a sphere) are called geodesics, a term derived from the Greek word meaning "earth-dividing." His initial dome models were nothing more than spheres or sections of spheres constructed from crisscrossing curved pieces of material (each of which represented an arc of a great circle) that formed triangles. Later, he expanded the concept and formed the curved pieces into even more complex structures such as tetrahedrons or octahedrons, which were then joined to create a spherical structure. Still, the simple triangulation of struts remained, as did the initial name of the invention.
Although Fuller's study of mathematics played a significant role in his invention of the geodesic dome, that process was also greatly influenced by his earlier extensive examination of and work within the field of construction. During his construction experience, he came to realize that the dome pattern had been employed, to some extent, ever since humans began building structures. Early sailors landing upon foreign shores and requiring immediate shelter would simply upend their ships, creating an arched shelter similar to a dome.
Land-dwelling societies copied that structure by locating a small clearing surrounded by young saplings and bending those uncut trees inward to form a dome that they covered with animal skins, thatch, or other materials. Over time, that structure developed into the classic yurt that still provides viable homes for many people in and around Afghanistan and the plains of the Soviet Union.
TO BE CONTINUED...