“By the 1980’s and 1990’s, Moore’s Law had emerged as the underlying assumption that governed almost everything in the Valley, from technology to business, education, and even culture”*…
… and Moore’s Law, the assertion that the number of transistors in a dense integrated circuit doubles approximately every two years, has indeed weathered pretty well since Gordon Moore first articulated it in 1965:
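The "doubling every two years" claim is easy to put in numbers. As a toy illustration (not a fit to real chip data), the sketch below projects transistor counts from a commonly cited baseline, the Intel 4004 of 1971 with roughly 2,300 transistors; the formula and the baseline figure are assumptions for demonstration only:

```python
# Toy model of Moore's Law: transistor count doubles every two years.
# Baseline assumption: Intel 4004 (1971), roughly 2,300 transistors.
BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2300

def projected_transistors(year: int) -> float:
    """Project a transistor count for `year`, assuming one doubling per two years."""
    doublings = (year - BASELINE_YEAR) / 2
    return BASELINE_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")
```

Run as written, the projection lands in the billions by the 2010s, which is roughly the order of magnitude of real processors of that era; the point is how quickly exponential doubling compounds, not precise prediction.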
Still, all along the way there have been naysayers, those who believe that it simply must come to an end. Even Intel, which Moore co-founded after serving as Director of Research at Fairchild, the semiconductor pioneer at which he worked when he published his “law,” has now questioned whether the doubling every two years can continue.
Of course, only time will tell. But believers can take heart in recently published work by Cornell researchers on “quantum dots” [pictured at the top of this post].
Just as the single-crystal silicon wafer forever changed the nature of communication 60 years ago, Cornell researchers hope their work with quantum dot solids — crystals made out of crystals — can help usher in a new era in electronics.
The team has fashioned two-dimensional superstructures out of single-crystal building blocks. Using a pair of chemical processes, the lead-selenium nanocrystals are synthesized into larger crystals, then fused together to form atomically coherent square superlattices.
The difference between these and previous crystalline structures is the atomic coherence of each 5-nanometer crystal (a nanometer is one-billionth of a meter). They’re not connected by a substance between each crystal — they’re connected to each other directly. The electrical properties of these superstructures are potentially superior to existing semiconductor nanocrystals, with anticipated applications in energy absorption and light emission…
* “By the 1980’s and 1990’s, Moore’s Law had emerged as the underlying assumption that governed almost everything in the Valley, from technology to business, education, and even culture. The ‘law’ said the number of transistors would double every couple of years. It dictated that nothing stays the same for more than a moment; no technology is safe from its successor; costs fall and computing power increases not at a constant rate but exponentially: If you’re not running on what became known as ‘Internet time,’ you’re falling behind.”
― John Markoff
As we get small, we might send thoughtful greetings to a man who can no doubt come up with new uses for that power: Seymour Papert; he celebrates his birthday on this date (as he was born on February 29, Leap Day, in 1928). Trained as a mathematician, Papert has been a pioneer of computer science, and in particular, artificial intelligence. He created the Epistemology and Learning Research Group at the MIT Architecture Machine Group (which later became the MIT Media Lab); he directed MIT’s Artificial Intelligence Laboratory; he co-created the hugely influential Logo computer language; and he is a principal of the One Laptop Per Child program. Called by Marvin Minsky “the greatest living mathematics educator,” Papert has won a Guggenheim fellowship (1980), a Marconi International fellowship (1981), the Software Publishers Association Lifetime Achievement Award (1994), and the Smithsonian Award (1997).