 Trilingual World Observatory: italiano, english, română. GLOBAL NEWS & more... by the editorial staff
   
 
Below are all the posts published on the site, in chronological order.
 
 

Heraclitus put it well more than two thousand years ago: "No one can bathe twice in the same river: it will no longer be the same river, and he will no longer be the same person." Time flies and slips away from us, inexorably, and its very essence remains a mystery. If philosophizing a little about time is enough to give you a headache, imagine what it means to grapple with the fourth dimension in the Cosmos, where the laws of physics change and things happen far stranger than everyday experience can help us grasp. Jean-Pierre Luminet knows a thing or two about this: an internationally renowned astrophysicist, a leading expert on black holes, as well as a prolific writer, poet, science populariser and musician. A multifaceted character, in short, at least as much as the kaleidoscope-like cosmological model he developed and christened, with a touch of linguistic hyperbole, the "crumpled Universe". His was the lectio magistralis (at the Auditorium Parco della Musica in Rome) that opened the seventh edition of the 2012 Festival delle Scienze, dedicated this year to "what happens when nothing else happens", as Richard Feynman would say. In other words: time.

"Time used to be absolute, identical in every reference frame. Then Einstein arrived, and time became elastic, relative, inextricably bound to the three spatial dimensions in space-time, bent by the distribution of matter and energy," explains Luminet, currently a research director at France's CNRS. "This has opened up fascinating aspects, such as the twin paradox and the possibility of freezing time in black holes." Who has never daydreamed about stopping the clock and stretching time, as in Alice in Wonderland? In space this is possible (even if the conditions would be decidedly too extreme to enjoy the benefits).

The scientist explains: "In strong gravitational fields, such as those generated by black holes, apparent time, the time measured by a stationary clock, is very different from real time, the time measured by a clock falling into the black hole. This is why apparent time can be frozen. In other words, a hypothetical outside observer would never, in his own future, see the object fall into the black hole, even though it actually disappeared within a few instants."
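For readers who want the standard formula behind the frozen-time effect Luminet describes (it is not quoted in the interview itself): for a non-rotating black hole of Schwarzschild radius r_s, a clock held at radius r runs slow, relative to a distant observer, by the factor

\[
  \frac{d\tau}{dt} \;=\; \sqrt{1 - \frac{r_s}{r}}\,,
\]

which tends to zero as r approaches r_s: to the faraway observer the infalling object appears to freeze at the horizon, even though in its own proper time it crosses it in moments.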

Hard going? What, then, about the origin of time? Did it exist before the Big Bang? "In the classical model, time has a starting point," continues Luminet, whose books include L'invenzione del Big Bang. Storia dell'origine dell'Universo (Dedalo, 2006). "In reality this is a limitation built into the theory of General Relativity. In more recent models, which take quantum effects into account (in which Einstein's theory of relativity is applied to the infinitely small, ed.), the notion of a time zero vanishes. It means that the Universe may have existed before the Big Bang, perhaps in a very different physical state."

Will we ever be able to uncover the truth about a past so distant, going back some 13 and a half billion years? Perhaps. "Space could still bear the imprint of the gravitational waves generated in the pre-Big Bang era," Luminet suggests. "Our current instruments, however, are not yet sensitive enough to test this hypothesis." The next generation of gravitational-wave telescopes will not back away from the challenge.

And if we look ahead, what future awaits us? Will cosmological time ever end, or will it last forever? "Recent observations of the Universe's future indicate that its expansion is accelerating, a discovery that earned the Nobel Prize in Physics in 2011: space, in fact, appears to be dominated by a repulsive form of energy called dark energy. But we do not know its nature, and so we cannot know whether the Universe will keep accelerating forever or not," Luminet replies.

In the equations one can even speculate that time does not exist and that, if it does, it will never end. In real life, no. "Time flows the same for everyone," the scientist says. "But it can be used more or less efficiently. My advice? Don't spend it doing meaningless things. Use it to increase knowledge, for the progress of society and of humanity."

Source: wired.it

 

People who eat a lot of red meat face a higher risk of suffering a stroke than those who eat chicken or turkey.

"The most important conclusion of this study is that the type of protein in the everyday diet affects stroke risk in different ways. We have to weigh the benefits of proteins in the context of the foods we eat," commented Dr. Frank Hu, a professor at the Harvard School of Public Health and one of the authors of the research.

People who eat a lot of red meat face an increased risk of stroke

Professor Hu and his colleagues gathered data from two large studies that tracked the health of tens of thousands of men and women from middle age until death.

Over the 20 years of the study, roughly 1,400 men and 2,600 women suffered a stroke.

To find out how dietary protein affects stroke risk, the researchers grouped the participants according to their daily intake of red meat, poultry, dairy and other protein sources.

Men who ate more than two servings of red meat a day had a 28% higher risk of stroke than those who ate a third of a serving of red meat a day. The researchers defined a serving of red meat as roughly 110-170 grams.

Women who ate about two servings of red meat a day had a 19% higher risk of stroke than those who ate less than half a serving a day.

The researchers found that people who gave up one serving of red meat a day, getting their protein from a serving of chicken or turkey instead, cut their stroke risk by 27%. Among those who replaced the serving of red meat with nuts or fish, the risk fell by 17%, and among those who replaced the red meat with dairy, the risk dropped by 10%.

Dr. Adam Bernstein, one of the study's authors, says he is not surprised by the results showing that people who eat a lot of red meat suffer more strokes.

"Am studiat si efectul pe care îl are carnea rosie asupra riscului de a face diabet sau de a suferi de afectiuni coronariene, asa ca mi se pare logic ca aceste afectiuni cardio-metabolice sa fie grupate", a afirmat Bernstein.

The researchers were surprised to find that people who ate fish did not show a lower risk of stroke. "It is possible that the benefits of fish depend on how it is prepared. The way people cook fish varies a great deal, and our study was not detailed enough to let us look at recipes," Bernstein explained.

Based on this research, the scientists recommend limiting red meat consumption in favour of poultry and fish.

Source: Reuters - via descopera.ro

 

Researchers at the U.S. Department of Energy's (DOE) Argonne National Laboratory have developed an extraordinarily efficient two-step process that electrolyzes, or separates, hydrogen atoms from water molecules before combining them to make molecular hydrogen (H2), which can be used in any number of applications from fuel cells to industrial processing.

Easier routes to the generation of hydrogen have long been a target of scientists and engineers, principally because the process to create the gas requires a great deal of energy. Approximately 2 percent of all electric power generated in the United States is dedicated to the production of molecular hydrogen, so scientists and engineers are searching for any way to cut that figure. "People understand that once you have hydrogen you can extract a lot of energy from it, but they don't realize just how hard it is to generate that hydrogen in the first place," said Nenad Markovic, an Argonne senior chemist who led the research.

This image depicts the series of reactions by which water is separated into hydrogen molecules and hydroxide (OH-) ions. The process is initiated by nickel-hydroxide clusters (green) embedded on a platinum framework (gray). Credit: Flickr

While a great deal of hydrogen is created by reforming natural gas at high temperatures, that process creates carbon-dioxide emissions. "Water electrolyzers are by far the cleanest way of producing hydrogen," Markovic said. "The method we've devised combines the capabilities of two of the best materials known for water-based electrolysis."

Most previous experiments in water-based electrolysis rely on special metals, like platinum, to adsorb and recombine reactive hydrogen intermediates into stable molecular hydrogen. Markovic's research focuses on the preceding step: improving the efficiency with which an incoming water molecule dissociates into its fundamental components. To do this, Markovic and his colleagues added clusters of a metallic complex known as nickel hydroxide, Ni(OH)2. Attached to a platinum framework, the clusters tore apart the water molecules, allowing the freed hydrogen to be catalyzed by the platinum.
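In textbook terms, the two steps correspond to the water-dissociation (Volmer) and hydrogen-recombination (Tafel) stages of alkaline hydrogen evolution; the half-reactions below are the standard ones, written here for orientation rather than quoted from the article:

\[
  \mathrm{H_2O} + e^- \longrightarrow \mathrm{H_{ads}} + \mathrm{OH^-} \quad \text{(water dissociation, promoted by the Ni(OH)}_2\text{ clusters)}
\]
\[
  2\,\mathrm{H_{ads}} \longrightarrow \mathrm{H_2} \quad \text{(recombination of adsorbed hydrogen on the platinum)}
\]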

"One of the most important points of this experiment is that we're combining two materials with very different benefits," said Markovic. "The advantage of using both oxides and metals in conjunction dramatically improves the catalytic efficiency of the whole system."

According to Argonne materials scientist George Crabtree, who helped to initiate the establishment of Argonne's energy conversion program, the researchers' success is attributable to their ability to work on what are known as "single-crystal" systems—defect-free materials that allow scientists to accurately predict how certain materials will behave at the atomic level. "We have not only increased catalytic activity by a factor of 10, but also now understand how each part of the system works. By scaling up from the single crystal to a real-world catalyst, this work illustrates how fundamental understanding leads quickly to innovative new technologies."

This work, supported by the DOE Office of Science, is reported in the December 2 issue of Science.

Source: Argonne National Laboratory

 

He started with a simple audio circuit and ended up a billionaire. The story of Ray Dolby, a pioneer of digital technology born in Portland, Oregon (USA) on January 18, 1933, literally passes through the sound barrier. In 1949, at just 16, his part-time job at Ampex let him get his hands on the first tape recorder on the market. So, after a degree at Stanford and a PhD at Cambridge, the young Dolby decided to strike out on his own and change the way we perceive the sounds recorded on magnetic tape.

Yes, that Dolby

It all began with the construction of the first compander, an electronic device capable of reducing background noise and interference in audio signals. The idea, born in 1965 when Dolby crossed the ocean to found Dolby Labs in England, was a great success among professional recording studios. Three years later, the new version of the circuit, the Dolby B-type, was built into consumer tape recorders: it was the start of a long climb to success.

In 1976, Dolby returned to the States and settled the company permanently in San Francisco, the city where he had spent much of his childhood. In the meantime, thanks to a patent granted in 1969, the Dolby Sound System had been born: the technology that transformed cinema sound. In practice, it was a system for improving the quality of dialogue in films, where soundtrack and speech were often mixed together poorly. To give an idea, the first film to use the Dolby system was a cinema masterpiece: A Clockwork Orange.

As time went on, the audio technology took further steps forward, staying in close contact with the world of cinema. In 1992, the atmosphere of Batman Returns became, to all intents and purposes, far more enveloping than that of any film screened before. Tim Burton's masterpiece was the first to experiment with the Dolby Stereo Digital surround system, in which the audio track was split into several channels, each feeding amplifiers placed in front of, beside and behind the audience.

Within a few years, sound began to surround viewers in their own homes as well. In 1995 the surround system was applied to home video and established itself as one of the audio standards favoured by film producers. The company Dolby founded grew enormously and was listed on the stock exchange in 2005. In 2011, however, after 45 years of activity, the father of surround sound left the company's board to retire to private life and enjoy the fortune he had built up over time. According to Forbes, Dolby is one of the 400 richest people in America: he ranks 144th, with a net worth of 2.9 billion dollars.

Source: wired.it

 

Life Technologies Corp. recently announced that its technology, which will make it possible to sequence a human genome in a single day for 1,000 USD, will be developed in collaboration with Baylor College of Medicine, the Yale School of Medicine and the Broad Institute of Cambridge.

DNA analysis will soon be within everyone's reach

Another American company, Illumina of San Diego, claims it will introduce a new technology that will fully decode the human genome within 24 hours.

Decoding the human genome in record time will bring major advantages to medicine, especially in cases where medical staff need to establish a patient's vulnerability to various diseases or the risk of allergic reactions to certain drugs, or want to try first-of-their-kind treatments.

The $1,000 cost of decoding a genome is similar to what most laboratories charge today, said Chris Nussbaum, co-director of the Genome Sequencing and Analysis Program at the Broad Institute. The only difference is how long it takes for the requested data to be delivered.

Richard Gibbs, director of the Human Genome Sequencing Center at Baylor, said for his part: "We will see whether the machine lives up to our expectations in terms of cost and accuracy. We remain optimistic."

Source: AFP - via descopera.ro

 

Getting there, however, will entail much more than incremental progress. It will require adopting entirely new technology and surmounting a formidable roster of technological problems. One of the most daunting of those, identifying and characterizing the factors that contaminate key lithographic components, has begun to yield to investigators in PML's Sensor Science Division, who have made some surprising and counterintuitive discoveries of use to industry.

In general, feature size is proportional to the wavelength of the light aimed at masks and photoresists in the lithography process. Today's super-small features are typically made with "deep" ultraviolet light at 193 nm. "But now we're trying to make a dramatic shift by dropping more than an order of magnitude, down to extreme ultraviolet (EUV) at 13.5 nm," says physicist Shannon Hill of the Ultraviolet Radiation Group. "That's going to be a big change."
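The proportionality Hill refers to is usually written as the Rayleigh resolution criterion; this is the standard textbook form rather than a formula quoted from the article:

\[
  \mathrm{CD} = k_1\,\frac{\lambda}{\mathrm{NA}},
\]

where CD is the smallest printable feature (the critical dimension), \(\lambda\) the exposure wavelength, NA the numerical aperture of the projection optics, and \(k_1\) a process-dependent factor. At fixed NA and \(k_1\), cutting \(\lambda\) from 193 nm to 13.5 nm shrinks the achievable feature size by roughly a factor of 14.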

This chamber and attached apparatus (see diagram, bottom) are used to introduce various gases to test contamination build-up on multi-layer surfaces.

In fact, the move to EUV complicates nearly every aspect of lithography. It will necessitate new kinds of plasmas to generate around 100 watts of EUV photons. It demands a high-quality vacuum for the entire photon pathway because EUV light is absorbed by air. And of course it requires the elimination of chemical contaminants on the Bragg-reflector focusing mirrors and elsewhere in the system, contaminants that result from outgassing of materials in the vacuum chamber.

As a rule, the focusing mirrors are expected to last five years and decrease in reflectivity no more than 1 percent in that period. Innovative in-situ cleaning techniques have made that longevity possible for the present deep UV environments. But the EUV regime raises new questions. "How can we gauge how long they're going to last or how often they will have to be cleaned?" says Ultraviolet Group leader Thomas Lucatorto. "Cleaning is done at the expense of productivity, so the industry needs some kind of accelerated testing."

Unfortunately, Hill adds, "You can't even test one of these mirrors until you know how everything outgasses. Ambient hydrocarbon molecules outgassing from all the components will adsorb on the mirror's surface, and then one of these high-energy photons comes along and, through various reactions, the hydrogen goes away and you're left with this amorphous, baked-on carbonaceous deposit."

But what, exactly, is its composition? How long does it take to form, and what conditions make it form faster or slower? To answer those questions, the researchers have been using 13.5 nm photons from the NIST synchrotron in a beam about 1 mm in diameter to irradiate a 12-by-18 mm target in multiple places.

"We built a chamber where we can take a sample, admit one of these contaminant gases at some controlled partial pressure, and then expose it to EUV and see how much carbon is deposited," Hill says. The chamber is kept at 10-10 torr before admission of contaminant gases, and the inside surface plated in gold. "Gold is very inert," Hill explains, "and we want to be able to pump the gases out of the chamber with no traces remaining."


Contamination forms on a clean multi-layer surface (top) when EUV photons react (middle) with gases, resulting in carbonaceous deposits (bottom). Photos: SEMATECH

In the course of building the chamber, "we learned some solid chemistry that was unfortunate," Lucatorto recalls. "These things are typically sealed with copper gaskets. Our stainless steel chamber was coated with gold, and we used copper gaskets. Well, it turns out that gold loves copper. It naturally forms a gold-copper alloy just by being in contact. So we could not take the flange off!"

"We made it even worse," Hill adds, "because we baked these chambers – heated them up to clean them off. So it was effectively welded. We had to get a crowbar and a hammer to get the edges apart. After that, we had our gaskets covered with silver."

The PML team uses two techniques to analyze the EUV-induced contamination – x-ray photoelectron spectroscopy (XPS), which reveals the atomic composition and some information on chemical state, and spectroscopic ellipsometry, which is very sensitive to variations in optical properties – integrated with data from surface scientists at Rutgers University.

"The great thing about spectroscopic ellipsometry," Hill says, "is that it can be done in air and it can map all the spots on the sample in 8 or 9 hours. But being NIST, we're concerned with measuring things accurately. And we've determined if you want to determine how much carbon is present, ellipsometry alone may not be the right way to go – it can give you some misleading answers. XPS is much slower. It takes around 4 hours just to do one spot. But the two techniques give complementary information, so we use both.

"There are several things we wanted to investigate, and one was the pressure scaling of the contamination rate – nanometers of carbon per unit time. Each spot was made in a very controlled way, at a known pressure and EUV dose. The key thing we started finding is that the rate does not scale linearly with pressure. It scales logarithmically. That's not at all what you'd expect. It's counterintuitive, and it has really important implications for the industry. You could spend millions of dollars designing a system in which you were able to lower the background partial pressure by, say, two orders of magnitude. You would think that you'd done a lot. But in fact, you would have only decreased your contamination rate by a factor of two – maybe."

In addition, PML collaborated with the research group at Rutgers that was headed by NIST alumnus Theodore Madey until his death in 2008. "They have a world-class surface-science lab that studies the fundamental physics of adsorption," Hill says. The Rutgers investigators found, contrary to simple models in which all the adsorption sites have the same binding energy, that in fact the measured adsorption energy changes with coverage. "That is," Hill explains, "as you put more and more molecules on, they are more and more weakly bound. That can qualitatively explain the logarithmic relation we found."
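One standard way to see the connection (a textbook Temkin-type argument, sketched here rather than taken from the article): if the adsorption energy falls roughly linearly with coverage \(\theta\), then at intermediate coverage the equilibrium coverage grows only logarithmically with the partial pressure P,

\[
  \theta \;\approx\; \frac{k_B T}{\alpha}\,\ln\!\left(K P\right),
\]

where \(\alpha\) describes how quickly the binding weakens and K is an equilibrium constant; a deposition rate proportional to the hydrocarbon coverage would then itself scale logarithmically with pressure.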


EUV lithography requires multiple mirrors (multi-layer Bragg reflectors) to position and focus the EUV beam.

"Shannon and Ted [Madey] were the first to fully explain this and present it to the surface-science community," Lucatorto says. Industry benefits because the work clearly shows manufacturers that they cannot evaluate a product's contamination potential by taking measurements at a single pressure or intensity.

In a parallel line of research, Hill, Lucatorto and the other members of the Ultraviolet Radiation Group – which includes Nadir Faradzhev, Charles Tarrio, and Steve Grantham – along with collaborator Lee Richter of the Surface and Interface Group, are studying the outgassing of different photoresists that may be used in EUV lithography.

The outgas characteristics have to be known in rigorous detail before a wafer and resist can be placed in an enormously expensive lithography apparatus. Using another station on the NIST synchrotron's Beam Line 1, they are exposing the photoresists to 13.5 nm light and measuring the outgassed substances both in the gas phase and as they are "baked" by EUV photons on a witness plate.

"There are commercially available ways to test resists using electrons as proxies for EUV light, under the assumption that the effects are relatively similar and scale in comparable ways," Hill says. "But right now, NIST is the only place available to any company to test these things using photons." So far, the throughput is around two a week.

"We'll get faster," Lucatorto says. "Companies would like us to do 10 or more a week. By comparison, for deep UV lithography – when contamination from outgassing was not as great a concern – resist manufacturers would test thousands of resists each month to refine lithographic quality."

Source: NIST

 

The rise of increasingly capable handheld devices is putting new demands on chip designers. Because handhelds are battery powered, energy conservation is at a premium, and many routine tasks that would be handled by software in a PC are instead delegated to special-purpose processors that do just one thing very efficiently. At the same time, handhelds are now so versatile that not everything can be hardwired: Some functions have to be left to software.

A hardware designer creating a new device needs to decide early on which functions will be handled in hardware and which in software. Halfway through the design process, however, it may become clear that something allocated to hardware would run much better in software, or vice versa. At that point, the designer has two choices: Either incur the expense — including time delays — of revising the design midstream, or charge to market with a flawed device.

At the Association for Computing Machinery’s 17th International Conference on Architectural Support for Programming Languages and Operating Systems, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) will present a new system that enables hardware designers to specify, in a single programming language, all the functions they want a device to perform. They can thereafter designate which functions should run in hardware and which in software, and the system will automatically churn out the corresponding circuit descriptions and computer code. Revise the designations, and the circuits and code are revised as well. The system also determines how to connect the special-purpose hardware and the general-purpose processor that runs the software, and it alerts designers if they try to implement in hardware a function that will work only in software, or vice versa.

The new system is an extension of the chip-design language BlueSpec, whose theoretical foundations were laid in the 1990s and early 2000s by MIT computer scientist Arvind, the Charles W. and Jennifer C. Johnson Professor of Electrical Engineering and Computer Science, and his students. BlueSpec Inc., a company that Arvind co-founded in 2003, turned that theoretical work into working, commercial code.

As Arvind explains, in the early 1980s, an engineer designing a new chip would begin by drawing pictures of circuit layouts. “People said, ‘This is crazy,’” Arvind says. “‘Why can’t I write this description textually?’” And indeed, 1984 saw the first iteration of Verilog, a language that lets designers describe the components of a chip and automatically converts those descriptions into a circuit diagram.

BlueSpec, in turn, offers an even higher level of abstraction. Instead of describing circuitry, the designer specifies a set of rules that the chip must follow, and BlueSpec converts those specifications into Verilog code. For many designers, this turns out to be much more efficient than worrying about the low-level details of the circuit layout from the outset. Moreover, BlueSpec can often find shortcuts that a human engineer might overlook, using significantly fewer circuit components to implement a given set of rules, and it can guarantee that the resulting chip will actually do what it’s intended to do.

For the new paper, Arvind, his PhD student Myron King, and former graduate student Nirav Dave (now a computer scientist at SRI International) expanded the BlueSpec instruction set so that it can describe more elaborate operations that are possible only in software. They also introduced an annotation scheme, so the programmer can indicate which functions will be implemented in hardware and which in software, and they developed a new compiler that translates the functions allocated to hardware into Verilog and those allocated to software into C++ code.

Today, King says, “if I consider my algorithm just to be a bunch of modules that I’ve hooked together somehow, and I want to move one of these modules into hardware, I actually have to re-implement it. I have to write it again in a different language. What we’re trying to give people is a language where they can describe the algorithm once and then play around with how the algorithm is partitioned.”
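A purely conceptual sketch of that idea in Python (this is not BlueSpec or the MIT compiler, and the names are invented for illustration): each module is described once, and re-partitioning means flipping an annotation rather than rewriting the module in another language.

    from dataclasses import dataclass, field

    @dataclass
    class Module:
        name: str
        rules: list = field(default_factory=list)  # abstract behavioural description
        target: str = "sw"                         # "hw" or "sw": the only knob the designer flips

    def compile_design(modules):
        """Pretend compiler: route each module to a hardware or software backend."""
        artifacts = {}
        for m in modules:
            if m.target == "hw":
                artifacts[m.name] = f"// Verilog stub generated for {m.name}"
            else:
                artifacts[m.name] = f"// C++ stub generated for {m.name}"
        return artifacts

    design = [
        Module("video_decode", rules=["..."], target="hw"),
        Module("ui_logic", rules=["..."], target="sw"),
    ]

    # Moving video_decode to software is a one-field change; its description is untouched.
    print(compile_design(design))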

King acknowledges that BlueSpec’s semantics — describing an algorithm as a set of rules rather than as a sequence of instructions — “is a radical departure from the way that most people think about software.” And indeed, among chip designers, Verilog is still much more popular than BlueSpec. “But it’s precisely this way of thinking about computation that allows you to generate both hardware and software,” King says.

Rajesh Gupta, the Qualcomm Professor in Embedded Microsystems at the University of California at San Diego, who wasn’t involved in the research, agrees. “Oftentimes, you need a dramatic change, not for the sake of the change, but because the problem demands it,” Gupta says. But, he adds, “hardware design is hard to begin with, and if some group of very smart people at MIT — who are not exactly known for making things simple — comes up with what looks like a very sophisticated model, some people will say, ‘My chances of making a mistake here are so high that I better not risk it.’ And hardware designers tend to be a little bit more conservative, anyway. So that’s why the adoption faces challenges.”

Still, Gupta says, the ability to redraw the partition between hardware and software could be enticing enough to overcome hardware designers’ conservatism. If you’re designing hardware for portable devices, “you need to be more power efficient than you are today,” Gupta says. But, he says, a device that relies too heavily on software requires so many layers of interpretation between the code and the circuitry that “by the time it actually does anything useful, it has done many other things that are useless, which are infrastructural.” To design systems that avoid such unnecessary, energy-intensive work, “you need this integrated view of hardware and software,” he says.

Source: PhysOrg

 

DNA and RNA are today the only molecules of life: they self-organize, replicate and are translated into enzymes and proteins, and they are present in the cells of every living being. But perhaps they have not been the only ones in the Earth's long history. A third molecule now joins the list of possible candidates: TNA, in which the sugar threose takes the place of the deoxyribose and ribose of DNA and RNA respectively.

DNA and RNA are in fact very complex molecules, probably too complex to have been the first forms of genetic material to appear. So various research groups are putting forward hypotheses and testing the possibility that, over billions of years, other configurations evolved (and then disappeared). TNA has been a candidate for several years now. John Chaput and his team at the Center for Evolutionary Medicine and Informatics, at Arizona State University's Biodesign Institute, have now created TNA molecules and, for the first time, followed their evolution on a substrate presenting a different protein each time.

The molecules proved capable of self-organizing into complex three-dimensional shapes and of binding the protein, developing a high degree of affinity. The study, published in Nature Chemistry, suggests that in the future it may be possible to evolve enzymes able to sustain an early form of life based on TNA.

As New Scientist reports, however, it is unlikely that TNA was a precursor of DNA and RNA because, although its structure is simpler and smaller, it is still very complex. There is also the fact, of course, that it has never been found in any living organism. The research is nevertheless important in light of the information that could come from upcoming space missions searching for life on Mars and other celestial bodies.

It is currently thought that the first molecule of life able to replicate itself was RNA; recently, however, the idea has been gaining ground that at the beginning there were instead mixtures of nucleic acids, as proposed by 2009 Nobel laureate Jack Szostak of Harvard University. Several cousins of our genetic material could have been present in this mosaic. New Scientist lists a few: PNA (peptide nucleic acid), GNA (glycol nucleic acid) and ANA (amyloid nucleic acid).

Source: wired.it

 

Eyeless shrimp and anemones with white tentacles have been photographed near cracks in the ocean floor from which water emerges at temperatures of up to 450 degrees Celsius.

Never-before-seen creatures discovered around the deepest known submarine hydrothermal vents

The submarine hot springs, named the Beebe Vent Field in honour of the first scientist to venture into the deep ocean, were discovered in the Caribbean Sea, south of the Cayman Islands.

In 2010, geochemist Doug Connelly of the UK's National Oceanography Centre and biologist Jon Copley of the University of Southampton used an underwater robot capable of diving to great depths to survey the sea floor. That is how new submarine hydrothermal vents were discovered rising almost three kilometres above the sea floor, near the undersea mount Dent.

The discovery shows that these submarine hydrothermal vents are far more widespread than previously thought. In addition, the cameras on the underwater robot captured stunning images of new species, including an unpigmented, ghostly-looking shrimp, named Rimicaris hybisae, which lives in colonies of up to 2,000 individuals per square metre. Lacking normal eyes, these shrimp have a light-sensitive organ on their backs that helps them navigate.

A related species, Rimicaris exoculata, had previously been discovered at a submarine hydrothermal vent some 4,000 kilometres away, on the Mid-Atlantic Ridge.

Besides the pale shrimp, other creatures were found at Mount Dent, including a snake-like fish, an unknown species of snail and an amphipod crustacean whose appearance is reminiscent of a flea.

Source: AFP - via descopera.ro

 

Using a specially designed facility, UCLA stem cell scientists have taken human skin cells, reprogrammed them into cells with the same unlimited property as embryonic stem cells, and then differentiated them into neurons while completely avoiding the use of animal-based reagents and feeder conditions throughout the process.

Generally, stem cells are grown using mouse "feeder" cells, which help the stem cells flourish and grow. But such animal-based products can lead to unwanted variations and contamination, and the cells must be thoroughly tested before they can be deemed safe for use in humans.

The UCLA study represents the first time scientists have derived induced pluripotent stem (iPS) cells with the potential for clinical use and differentiated them into neurons in animal origin–free conditions using commercially available reagents to facilitate broad application, said Saravanan Karumbayaram, the first author of the study and an associate researcher with the Eli and Edythe Broad Center of Regenerative Medicine and Stem Cell Research at UCLA.

The Broad Center researchers also developed a set of standard operating procedures for the process so that other scientists can benefit from the derivation and differentiation techniques. The process was performed under good manufacturing practices (GMP) protocols, which are tightly controlled and regulated, so the cells created meet all the standards required for use in humans.

"Developments in stem cell research show that pluripotent stem cells ultimately will be translated into therapies, so we are working to develop the methods and systems needed to make the cells safe for human use," Karumbayaram said.

The study was published Dec. 7 in the early online edition of the inaugural issue of the peer-reviewed journal Stem Cells Translational Medicine, a new journal that seeks to bridge stem cell research and clinical trials.

Karumbayaram tested six different animal-free media formulations before arriving at a composition that generated the most robust pluripotent stem cells. He combined two commercial media solutions to create his own mix and tried different concentrations of an important growth factor.

"The colonies we get are of very good quality and are quite stable," said Karumbayaram, who compared his animal-free colonies to those created conventionally using mouse feeder cells. Efficiency did suffer. Fewer colonies were created using the animal-free feeders, but the colonies did remain stable for at least 20 passages.

The neurons that resulted from the process started life as a small skin-punch biopsy from a volunteer. The skin cells were then reprogrammed to become pluripotent stem cells with the ability to make any cell in the human body. These iPS cells were grown in colonies and were later coaxed into becoming neural precursor cells and, finally, neurons.

The animal-free cells were compared at every step in the process to cells produced by typical animal-based methods, Karumbayaram said, and were found to be of very similar quality.

"We were very excited when we saw the first colonies growing, because we were not sure it would be possible to derive and grow cells completely animal-free," he said.

Because the cells were grown in a special facility designed to culture animal-free cells, the testing and examination required to make clinical-grade cells should be much simpler, said William Lowry, senior author of the study and an assistant professor of molecular, cell and developmental biology in the UCLA Division of Life Sciences.

To date, at least 15 animal-free iPS cell lines have been created at the Broad Stem Cell Research Center.

"It's critical to note that we are nowhere near ready to use these cells in the clinic," Lowry said. "We are working to develop methods to make sure these cells are genetically stable and will be as safe as possible for human use. The main goal of this project was to generate a platform that will one day allow translation of stem cells to the clinic."

Source: Medical Xpress
