Nanotechnology ("nanotech") is the manipulation of matter on an atomic, molecular, and supramolecular scale. The earliest, widespread description of nanotechnology[1][2] referred to the particular technological goal of precisely manipulating atoms and molecules for fabrication of macroscale products, also now referred to as molecular nanotechnology. A more generalized description of nanotechnology was subsequently established by the National Nanotechnology Initiative, which defines nanotechnology as the manipulation of matter with at least one dimension sized from 1 to 100 nanometers. This definition reflects the fact that quantum mechanical effects are important at this quantum-realm scale, and so the definition shifted from a particular technological goal to a research category inclusive of all types of research and technologies that deal with the special properties of matter which occur below the given size threshold. It is therefore common to see the plural form "nanotechnologies," as well as "nanoscale technologies," used to refer to the broad range of research and applications whose common trait is size. Because of the variety of potential applications (including industrial and military), governments have invested billions of dollars in nanotechnology research. By 2012, the United States had invested $3.7 billion through its National Nanotechnology Initiative, the European Union $1.2 billion, and Japan $750 million.[3]

Nanotechnology as defined by size is naturally very broad, including fields of science as diverse as surface science, organic chemistry, molecular biology, semiconductor physics, microfabrication, and molecular engineering.[4] The associated research and applications are equally diverse, ranging from extensions of conventional device physics to completely new approaches based upon molecular self-assembly, and from developing new materials with dimensions on the nanoscale to direct control of matter on the atomic scale.
Scientists currently debate the future implications of nanotechnology. Nanotechnology may be able to create many new materials and devices with a vast range of applications, such as in nanomedicine, nanoelectronics, biomaterials, energy production, and consumer products. On the other hand, nanotechnology raises many of the same issues as any new technology, including concerns about the toxicity and environmental impact of nanomaterials,[5] their potential effects on global economics, and speculation about various doomsday scenarios. These concerns have led to a debate among advocacy groups and governments on whether special regulation of nanotechnology is warranted.

The concepts that seeded nanotechnology were first discussed in 1959 by renowned physicist Richard Feynman in his talk There's Plenty of Room at the Bottom, in which he described the possibility of synthesis via direct manipulation of atoms. The term "nano-technology" was first used by Norio Taniguchi in 1974, though it was not widely known. Inspired by Feynman's concepts, K. Eric Drexler used the term "nanotechnology" in his 1986 book Engines of Creation: The Coming Era of Nanotechnology, which proposed the idea of a nanoscale "assembler" that would be able to build a copy of itself and of other items of arbitrary complexity with atomic control. Also in 1986, Drexler co-founded The Foresight Institute (with which he is no longer affiliated) to help increase public awareness and understanding of nanotechnology concepts and implications. The emergence of nanotechnology as a field in the 1980s thus occurred through the convergence of Drexler's theoretical and public work, which developed and popularized a conceptual framework for nanotechnology, and high-visibility experimental advances that drew additional wide-scale attention to the prospects of atomic control of matter. In the 1980s, two major breakthroughs sparked the growth of nanotechnology in the modern era.
First, the invention of the scanning tunneling microscope in 1981 provided unprecedented visualization of individual atoms and bonds, and the instrument was successfully used to manipulate individual atoms in 1989. The microscope's developers, Gerd Binnig and Heinrich Rohrer of the IBM Zurich Research Laboratory, received the Nobel Prize in Physics in 1986.[6][7] Binnig, Quate and Gerber also invented the analogous atomic force microscope that year.

Second, fullerenes were discovered in 1985 by Harry Kroto, Richard Smalley, and Robert Curl, who together won the 1996 Nobel Prize in Chemistry.[8][9] Buckminsterfullerene C60, also known as the buckyball, is a representative member of the carbon structures known as fullerenes, and members of the fullerene family are a major subject of research falling under the nanotechnology umbrella. C60 was not initially described as nanotechnology; the term was used regarding subsequent work with related graphene tubes (called carbon nanotubes and sometimes Bucky tubes) which suggested potential applications for nanoscale electronics and devices.

In the early 2000s, the field garnered increased scientific, political, and commercial attention that led to both controversy and progress. Controversies emerged regarding the definitions and potential implications of nanotechnologies, exemplified by the Royal Society's report on nanotechnology.[10] Challenges were raised regarding the feasibility of applications envisioned by advocates of molecular nanotechnology, which culminated in a public debate between Drexler and Smalley in 2001 and 2003.[11] Meanwhile, commercialization of products based on advancements in nanoscale technologies began emerging. These products are limited to bulk applications of nanomaterials and do not involve atomic control of matter.
Some examples include the Silver Nano platform for using silver nanoparticles as an antibacterial agent, nanoparticle-based transparent sunscreens, carbon fiber strengthening using silica nanoparticles, and carbon nanotubes for stain-resistant textiles.[12][13] Governments moved to promote and fund research into nanotechnology, such as in the U.S. with the National Nanotechnology Initiative, which formalized a size-based definition of nanotechnology and established funding for research on the nanoscale, and in Europe via the European Framework Programmes for Research and Technological Development. By the mid-2000s, new and serious scientific attention began to flourish. Projects emerged to produce nanotechnology roadmaps[14][15] which center on atomically precise manipulation of matter and discuss existing and projected capabilities, goals, and applications.

Nanotechnology is science, engineering, and technology conducted at the nanoscale, which is about 1 to 100 nanometers. Nanoscience and nanotechnology are the study and application of extremely small things and can be used across all the other science fields, such as chemistry, biology, physics, materials science, and engineering. The ideas and concepts behind nanoscience and nanotechnology started with a talk entitled "There's Plenty of Room at the Bottom" by physicist Richard Feynman, often called the father of nanotechnology, at an American Physical Society meeting at the California Institute of Technology (Caltech) on December 29, 1959, long before the term nanotechnology was used. In his talk, Feynman described a process in which scientists would be able to manipulate and control individual atoms and molecules. Over a decade later, in his explorations of ultraprecision machining, Professor Norio Taniguchi coined the term nanotechnology. It wasn't until 1981, with the development of the scanning tunneling microscope that could "see" individual atoms, that modern nanotechnology began.
Medieval stained glass windows are an example of how nanotechnology was used in the pre-modern era. (Courtesy: NanoBioNet) It's hard to imagine just how small nanotechnology is. One nanometer is a billionth of a meter, or 10⁻⁹ meters. Here are a few illustrative examples: a sheet of paper is about 100,000 nanometers thick, a strand of human hair is roughly 80,000 to 100,000 nanometers in diameter, and a fingernail grows about one nanometer every second.
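To make the nanometer scale concrete, here is a quick back-of-the-envelope sketch in Python. The comparison figures are commonly cited approximations chosen for illustration, not values taken from this text:

```python
# Converting everyday sizes into nanometers to illustrate the scale.
NM_PER_M = 1e9  # one meter contains a billion nanometers

paper_thickness_m = 1e-4  # a sheet of paper is roughly 0.1 mm thick
hair_width_m = 9e-5       # a human hair is roughly 90 micrometers wide

print(f"Sheet of paper: {paper_thickness_m * NM_PER_M:,.0f} nm thick")
print(f"Human hair:     {hair_width_m * NM_PER_M:,.0f} nm wide")

# The 1-100 nm nanoscale window, expressed back in meters:
print(f"Nanoscale: {1 / NM_PER_M:.0e} m to {100 / NM_PER_M:.0e} m")
```

Both everyday objects come out around 100,000 nm, roughly a thousand times larger than the upper edge of the 1 to 100 nm nanoscale window.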
But something as small as an atom is impossible to see with the naked eye. In fact, it's impossible to see with the microscopes typically used in high school science classes. The microscopes needed to see things at the nanoscale were invented relatively recently, about 30 years ago. Once scientists had the right tools, such as the scanning tunneling microscope (STM) and the atomic force microscope (AFM), the age of nanotechnology was born. Although modern nanoscience and nanotechnology are quite new, nanoscale materials were used for centuries: differently sized gold and silver particles created the colors in the stained glass windows of medieval churches hundreds of years ago. The artists back then just didn't know that the process they used to create these beautiful works of art actually led to changes in the composition of the materials they were working with. Today's scientists and engineers are finding a wide variety of ways to deliberately make materials at the nanoscale to take advantage of their enhanced properties, such as higher strength, lighter weight, increased control of the light spectrum, and greater chemical reactivity than their larger-scale counterparts.
The kinetic energy of particles of non-ionizing radiation is too small to produce charged ions when passing through matter. For non-ionizing electromagnetic radiation (see types below), the associated particles (photons) have only sufficient energy to change the rotational, vibrational or electronic valence configurations of molecules and atoms. The effect of non-ionizing forms of radiation on living tissue has only recently been studied. Nevertheless, different biological effects are observed for different types of non-ionizing radiation.[3][5]
Even "non-ionizing" radiation is capable of causing thermal ionization if it deposits enough heat to raise temperatures to ionization energies. These reactions require far higher total energies than ionizing radiation, where a single particle suffices to cause ionization. Familiar examples of thermal ionization are the flame ionization of a common fire and the browning of food by infrared radiation during broiling-type cooking.

The electromagnetic spectrum is the range of all possible electromagnetic radiation frequencies.[3] The electromagnetic spectrum (usually just "spectrum") of an object is the characteristic distribution of electromagnetic radiation emitted or absorbed by that particular object. The non-ionizing portion of electromagnetic radiation consists of electromagnetic waves that (as individual quanta or particles; see photon) are not energetic enough to detach electrons from atoms or molecules and hence cause their ionization. These include radio waves, microwaves, infrared, and (sometimes) visible light. The lower frequencies of ultraviolet light may cause chemical changes and molecular damage similar to ionization, but are technically not ionizing. The highest frequencies of ultraviolet light, as well as all X-rays and gamma rays, are ionizing. The occurrence of ionization depends on the energy of the individual particles or waves, not on their number. An intense flood of particles or waves will not cause ionization if the individual particles or waves do not carry enough energy to be ionizing, unless they raise the temperature of a body high enough to ionize small fractions of atoms or molecules through thermal ionization (this, however, requires relatively extreme radiation intensities). Radiation with sufficiently high energy can ionize atoms; that is to say, it can knock electrons off atoms and create ions.
Ionization occurs when an electron is stripped (or "knocked out") from an electron shell of the atom, which leaves the atom with a net positive charge. Because living cells and, more importantly, the DNA in those cells can be damaged by this ionization, exposure to ionizing radiation is considered to increase the risk of cancer. Thus "ionizing radiation" is somewhat artificially separated from particle radiation and electromagnetic radiation, simply due to its great potential for biological damage. While an individual cell is made of trillions of atoms, only a small fraction of those will be ionized at low to moderate radiation powers. The probability of ionizing radiation causing cancer is dependent upon the absorbed dose of the radiation, and is a function of the damaging tendency of the type of radiation (equivalent dose) and the sensitivity of the irradiated organism or tissue (effective dose).
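The dose quantities mentioned above, equivalent dose and effective dose, can be sketched numerically. The sketch below uses representative ICRP-style weighting factors; the organ doses themselves are invented example numbers, not data from this text:

```python
# Sketch of equivalent dose and effective dose, assuming ICRP-style
# weighting factors. Absorbed dose is in gray (Gy); the weighted
# dose quantities are in sievert (Sv).

# Radiation weighting factors (ICRP 103 values for these types).
RADIATION_WEIGHT = {"photon": 1, "electron": 1, "proton": 2, "alpha": 20}

# Tissue weighting factors (a subset of ICRP 103 values).
TISSUE_WEIGHT = {"lung": 0.12, "thyroid": 0.04, "skin": 0.01}

def equivalent_dose(absorbed_gy: float, radiation: str) -> float:
    """Equivalent dose: absorbed dose scaled by the damaging tendency
    of the radiation type."""
    return absorbed_gy * RADIATION_WEIGHT[radiation]

def effective_dose(equiv_by_tissue: dict) -> float:
    """Effective dose: equivalent doses scaled by the sensitivity of
    each irradiated tissue, then summed."""
    return sum(TISSUE_WEIGHT[t] * h for t, h in equiv_by_tissue.items())

# Example: 1 mGy of alpha radiation absorbed by the lung.
h_lung = equivalent_dose(0.001, "alpha")  # 0.02 Sv equivalent dose
print(effective_dose({"lung": h_lung}))   # 0.0024 Sv effective dose
```

The design mirrors the text's distinction: the radiation weighting factor captures how damaging the radiation type is, while the tissue weighting factor captures how sensitive the irradiated tissue is.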
If the source of the ionizing radiation is a radioactive material or a nuclear process such as fission or fusion, there is particle radiation to consider. Particle radiation consists of subatomic particles accelerated to relativistic speeds by nuclear reactions. Because of their momenta they are quite capable of knocking out electrons and ionizing materials, but since most carry an electrical charge, they lack the penetrating power of electromagnetic ionizing radiation; neutrons, which carry no charge, are the exception. There are several different kinds of these particles, but the majority are alpha particles, beta particles, neutrons, and protons. Roughly speaking, photons and particles with energies above about 10 electron volts (eV) are ionizing (some authorities use 33 eV, the ionization energy for water). Particle radiation from radioactive material or cosmic rays almost invariably carries enough energy to be ionizing. Much ionizing radiation originates from radioactive materials and space (cosmic rays), and as such is naturally present in the environment, since most rock and soil contain small concentrations of radioactive materials. The radiation is invisible and not directly detectable by human senses; as a result, instruments such as Geiger counters are usually required to detect its presence. In some cases, it may lead to secondary emission of visible light upon its interaction with matter, as in the cases of Cherenkov radiation and radioluminescence. (Graphic: relationships between radioactivity and detected ionizing radiation.) Ionizing radiation has many practical uses in medicine, research, and construction, but presents a health hazard if used improperly.
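The roughly 10 eV rule of thumb for photons can be checked with the standard photon-energy relation E = hc/λ. The wavelengths below are illustrative choices, not values from this text:

```python
# Compute photon energy from wavelength (E = h*c / wavelength) and
# compare it against the ~10 eV ionization rule of thumb.
H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
EV = 1.602e-19  # joules per electron volt

def photon_energy_ev(wavelength_m: float) -> float:
    return H * C / wavelength_m / EV

for name, wl in [("FM radio (3 m)", 3.0),
                 ("visible green (550 nm)", 550e-9),
                 ("UV-C (250 nm)", 250e-9),
                 ("soft X-ray (1 nm)", 1e-9)]:
    e = photon_energy_ev(wl)
    label = "ionizing" if e > 10 else "non-ionizing"
    print(f"{name}: {e:.3g} eV -> {label}")
```

The numbers reproduce the text's classification: radio and visible photons fall far below the threshold, a 250 nm ultraviolet photon (about 5 eV) is damaging but still technically non-ionizing, and an X-ray photon exceeds the threshold by orders of magnitude.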
Exposure to radiation causes damage to living tissue. High doses result in acute radiation syndrome (ARS), with skin burns, hair loss, internal organ failure, and death, while any dose may result in an increased chance of cancer and genetic damage. A particular form of cancer, thyroid cancer, often occurs when nuclear weapons and reactors are the radiation source, because of the biological proclivities of the radioactive iodine fission product, iodine-131.[3] However, calculating the exact risk and chance of cancer forming in cells caused by ionizing radiation is still not well understood, and current estimates are loosely determined from population data from the atomic bombings of Japan and from reactor-accident follow-up, such as with the Chernobyl disaster. The International Commission on Radiological Protection states that "The Commission is aware of uncertainties and lack of precision of the models and parameter values", "Collective effective dose is not intended as a tool for epidemiological risk assessment, and it is inappropriate to use it in risk projections" and "in particular, the calculation of the number of cancer deaths based on collective effective doses from trivial individual doses should be avoided."[4]