Friday 30 November 2012

Human Genome Project, the first step towards personalized medicine


DNA, or deoxyribonucleic acid, is the hereditary material in humans and almost all other organisms. Nearly every cell in a person’s body has the same DNA. Most DNA is located in the cell nucleus (where it is called nuclear DNA), but a small amount of DNA can also be found in the mitochondria (where it is called mitochondrial DNA or mtDNA).
Human DNA consists of about 3 billion bases, and more than 99 percent of those bases are the same in all people. The order, or sequence, of these bases determines the information available for building and maintaining an organism, similar to the way in which letters of the alphabet appear in a certain order to form words and sentences. One of the most important scientific achievements of recent decades was the effort to map the roughly 25,000 genes in human DNA, known as the Human Genome Project.
The project began in October 1990 and was initially headed by Aristides Patrinos, head of the Office of Biological and Environmental Research in the U.S. Department of Energy's Office of Science. Francis Collins directed the US National Institutes of Health (NIH) National Human Genome Research Institute efforts.
A working draft of the genome was announced in 2000 and a complete one in 2003, with further, more detailed analysis still being published. A parallel project was conducted outside of government by the Celera Corporation, or Celera Genomics, which was formally launched in 1998. Most of the government-sponsored sequencing was performed in universities and research centres from the United States, the United Kingdom, Japan, France, Germany and Spain. Researchers continue to identify protein-coding genes and their functions; the objective is to find disease-causing genes and possibly use the information to develop more specific treatments. It also may be possible to locate patterns in gene expression, which could help physicians glean insight into the body's emergent properties.

While the objective of the Human Genome Project is to understand the genetic makeup of the human species, the project has also focused on several other nonhuman organisms such as E. coli, the fruit fly, and the laboratory mouse. It remains one of the largest single investigative projects in modern science.
The Human Genome Project originally aimed to map the nucleotides contained in a human haploid reference genome (more than three billion).
The "genome" of any given individual (except for identical twins and cloned organisms) is unique; mapping "the human genome" involves sequencing multiple variations of each gene. The project did not study the entire DNA found in human cells; some heterochromatic areas (about 8% of the total genome) remain un-sequenced.

The technology underlying genomics research has been improving exponentially every couple of years, similar to the way computer technology improves under Moore’s Law.
"Up to 2006, the various cycles of new technology and introduction were cutting costs in half for a similar product every 22 months," said Adam Felsenfeld, a program director at the National Human Genome Research Institute, which invests about as much money in DNA sequencing as the Sanger Institute.
Progress in DNA sequencing has been as breathtaking as the improvement in computer processor speeds. The Human Genome Project was estimated to cost $3 billion to sequence a single genome when it began in 1990, but cost reductions during the decade-long effort drove its actual cost closer to $300 million. By the end of the project, researchers estimated that if they were starting again, they could have sequenced the genome for less than $50 million.

By 2006, Harvard’s George Church estimated that his lab could sequence a genome for $2.2 million. In 2007, the sequencing of James Watson’s genome was said to cost less than $1 million. Looking into the future, the NIH wants genomes to cost a mere $100,000 by 2009, and $1,000 five years later.
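As a rough illustration of how such a cost curve plays out, here is a small Python sketch (the 22-month halving period is the figure quoted above; the $300 million starting point around 2003 is my own illustrative assumption, not an official NHGRI number):

    # Project sequencing cost forward assuming it halves every 22 months.
    # Starting cost and year are illustrative assumptions.
    def projected_cost(start_cost_usd, start_year, year, halving_months=22):
        months_elapsed = (year - start_year) * 12
        return start_cost_usd * 0.5 ** (months_elapsed / halving_months)

    for year in (2003, 2006, 2009, 2012):
        cost = projected_cost(300e6, 2003, year)
        print(f"{year}: ~${cost / 1e6:.1f} million per genome")

In reality the quoted prices fell even faster than this simple halving model after 2007, which is what makes the $1,000 target look plausible at all.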
With dropping costs and increasing speed, a flood of genetic data is flowing out of institutes across the world. Previous progress was measured in gigabases (billions of DNA base pairs), but now major research centers are stepping up to the terabase level (trillions of base pairs, or bp). (Human genomes contain about 3 gigabases.)
The next generation of technologies could come from a new set of companies like Pacific Biosciences and Helicos. These technologies promise to deliver very long reads, perhaps up to 100,000 bp, which would allow scientists to spot new types of patterns as well as make assembling genomes much easier. The technology is expected to become viable within five years.
The quest to harness the power of DNA to develop personalized medicine is on the threshold of a major milestone: the $1,000 genome.
Scientists say that breaking the $1,000 barrier, roughly the price of an MRI scan, will accelerate an already fast-moving transformation in genetic discovery and drug development. Some experts believe a person's genetic code eventually will be used routinely to guide prevention and treatment of illnesses throughout life. Drug companies increasingly are identifying gene variants that they can target with drugs. And geneticists are identifying more and more diseases that result from a mutation in just one gene.
Genomic information also may give individuals information about their risk for a common disease and predict how one will respond to particular medications or environmental exposures, such as radiation from medical tests, according to the U.S. Department of Energy Genome Programs.
Whole-genome sequencing, as opposed to identifying just a subset of genes suspected of being linked to an illness, allows scientists to look broadly across all genes for mutations that are associated with diseases.
But in the past year or two, drugs based on genomic information have begun to reach the market. Still, the wider availability and lower price of sequencing raise the question of how to convert the flood of genetic data into useful information for drug development and treating patients. The future looks bright for genetics: as the cost of sequencing comes down, the possibility of a tailor-made cure is around the corner. With the hope of preventative cures through genomic diagnostics and the replacement of mutated genes, the old ways of surgery may become a thing of the past. Expect preventative cures by popping a pill, and quite possibly a reduction in hospital bills, once you have mapped out your own DNA...


Thursday 29 November 2012

The HOTOL project, a brief description


The Multi-Unit Space Transport And Recovery Device (MUSTARD) was a concept explored by the British Aircraft Corporation (BAC) around 1964-1965 for launching payloads weighing as much as 5,000 lb into orbit. It was never constructed. The British Government also backed the development of an SSTO spaceplane, called HOTOL.

HOTOL, for Horizontal Take-Off and Landing, was a British air-breathing spaceplane effort by Rolls-Royce and British Aerospace.
Designed as a single-stage-to-orbit (SSTO) reusable winged launch vehicle, it was to be fitted with a unique air-breathing engine, the RB545, called the Swallow, to be developed by Rolls-Royce. The engine was technically a liquid hydrogen/liquid oxygen design, but it dramatically reduced the amount of oxidizer that needed to be carried on board by using atmospheric oxygen as the spacecraft climbed through the lower atmosphere.
Since propellant typically represents the majority of the takeoff weight of a rocket, HOTOL was to be considerably smaller than normal pure-rocket designs, roughly the size of a medium-haul airliner such as the McDonnell Douglas DC-9/MD-80. However, comparison with a rocket vehicle using similar construction techniques failed to show much advantage, and funding for the vehicle ceased.
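To see why propellant dominates a rocket's takeoff weight, consider the Tsiolkovsky rocket equation. The sketch below uses illustrative round numbers (about 9.4 km/s of delta-v to reach orbit and a 4.4 km/s effective exhaust velocity for a hydrogen/oxygen engine), not actual HOTOL design figures:

    import math

    # Tsiolkovsky rocket equation: delta_v = v_e * ln(m0 / mf)
    # Illustrative values, not HOTOL design numbers.
    delta_v = 9400.0   # m/s, rough delta-v to low Earth orbit including losses
    v_e = 4400.0       # m/s, effective exhaust velocity of a LH2/LOX engine

    mass_ratio = math.exp(delta_v / v_e)          # m0 / mf
    propellant_fraction = 1.0 - 1.0 / mass_ratio  # share of takeoff mass that is propellant
    print(f"mass ratio ~{mass_ratio:.1f}, propellant ~{propellant_fraction:.0%} of takeoff mass")

Roughly 88 percent of the takeoff mass ends up being propellant in this simple model, and most of that is liquid oxygen, which is exactly the mass an air-breathing first phase avoids carrying.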

HOTOL would have been 63 metres long, 12.8 metres high, 7 metres in diameter and with a wingspan of 28.3 metres. The unmanned craft was intended to put a payload of around 7 to 8 tonnes in orbit, at 300 km altitude. It was intended to take off from a runway, mounted on the back of a large rocket-boosted trolley that would help get the craft up to "working speed". The engine was intended to switch from jet propulsion to pure rocket propulsion at 26–32 km high, by which time the craft would be traveling at Mach 5 to 7. After reaching orbit, HOTOL was intended to re-enter the atmosphere and glide down to land on a conventional runway (approx 1,500 metres minimum).
During development, it was found that the comparatively heavy rear-mounted engine moved the center of mass of the vehicle rearwards. This meant that the vehicle had to be designed to push the center of drag as far rearward as possible to ensure stability over the entire flight regime.
Cooling the incoming air, which reached around 1,000 degrees Celsius at high speed, down to temperatures the new engine could work with became a problem too, as condensation and frosting blocked airflow and choked the intake supply.
Redesigning the vehicle to address these issues cost a significant proportion of the payload, and made the economics unclear. In particular, some of the analysis seemed to indicate that similar technology applied to a pure rocket approach would give at least as good performance at lower cost.
In 1988 the government withdrew further funding. The project was approaching the end of its design phase, but the plans were still speculative and dogged by aerodynamic problems and operational disadvantages.
A cheaper redesign, Interim HOTOL or HOTOL 2, to be launched from the back of a modified Antonov An-225 transport aircraft, was offered by BAe in 1991, but that too was rejected. Interim HOTOL was to have dispensed with the air-breathing engine cycle and was designed to use more conventional LOX and liquid hydrogen. Although the British government cancelled the HOTOL project, lead engineer Alan Bond looked towards the European Space Agency for funding. But the HOTOL design had been classified, which restricted the patent rights and stalled the progress of a spaceplane except for military purposes.
In 1989 HOTOL co-creator Alan Bond formed Reaction Engines Limited, which has since been working on the Skylon vehicle, intended to solve the problems of HOTOL. Much of Bond's early work used his knowledge of computer-aided design to work out the design flaws in HOTOL. The fundamental centre-of-gravity problem was solved by simply placing the engines on the tips of the wings, much like the layout of a modern jet. The patent restrictions on the HOTOL project ended in 1993, with the patents later held by Rolls-Royce (which expressed no further interest), but Bond worked around much of the thermodynamic approach covered by the patent he had originally written. This allowed Reaction Engines to move forward and obtain support and funding.
The design problems of the original HOTOL project were ultimately solved by technology and by the will and determination of the original founders, Alan Bond, Richard Varvill and John Scott-Scott.
Despite the need for 250 million pounds to fund the next three-year development phase, in which the company plans to build a small-scale version of the complete engine, there seems to be a sense of inevitability that it will succeed, considering that the only alternative is expensive, multi-stage chemical rockets.
While competing spacecraft such as the US Space Shuttle retire from action, the scramjet is the only other contender, and the future of orbital spacecraft needs to embrace new concepts of propulsion. As road vehicles slowly change to more economical alternatives, so should the space industry. The sad fact is that political will did not support HOTOL during a time in which President Ronald Reagan had plans to create a "Star Wars" programme; the tactical advantage of an orbital military base would have relied heavily on a craft like HOTOL. Consider the possibilities if HOTOL had been a success: a working fleet would have been flying in the 90s, and an orbital infrastructure might have transformed air and space travel. But despite the slow progress of this technology and the sceptics who would have mothballed the project, HOTOL has been reborn as SKYLON, and a working prototype might be ready as soon as 2020...

Wednesday 28 November 2012

Battery technology, the next generation of vehicular power

Battery technology has hit a point where powering small gadgets is no longer enough; the battery of yesterday has to change to meet today's uses. Although small gadgets tend to consume less power, people's demands for longer-lasting power are not diminishing. And while rechargeable batteries and wireless induction charging conveniently top up our phones, we are now placing far heavier power demands on our vehicles.

The high price of the fossil fuels that let us travel has some scientists rethinking the battery paradigm. The electric vehicles of yesterday were limited to milk floats covering neighbourhood distances; the lead-acid battery allowed those early vans to recharge economically and be used again and again. Today the electric car, once a slowly progressing concept, has blossomed into something like the Tesla. Although the dream of having an electric car feels real, the reality is that the £20,000 car is only as good as its battery. Because the battery in such a vehicle does the job of the fuel tank in a normal petrol car, we have to compare the energy density of a kilogram of lithium-ion battery with a kilogram of petrol.
While uranium-235 (nuclear power) has an energy density of 79,500,000 megajoules per kilogram, lithium-ion has 0.72 megajoules per kilogram, compared to petrol at a respectable 35 megajoules per kilogram.
The idea of replacing petrol with today's battery technology seems overwhelming when looking at the figures and comparing energy densities. Zinc–air batteries (non-rechargeable) and zinc–air fuel cells (mechanically rechargeable) are electro-chemical batteries powered by oxidizing zinc with oxygen from the air. These batteries have high energy densities and are relatively inexpensive to produce. Sizes range from very small button cells for hearing aids, to larger batteries used in film cameras that previously used mercury batteries, to very large batteries used for electric vehicle propulsion, with energy densities of roughly 1.692–4.932 MJ/kg.
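To put those numbers side by side, here is a small Python comparison using the energy densities quoted above (the zinc-air entry uses the upper end of the range purely for illustration):

    # Energy densities quoted above, in megajoules per kilogram.
    energy_density_mj_per_kg = {
        "uranium-235": 79_500_000,
        "petrol": 35,
        "zinc-air (upper figure)": 4.932,
        "lithium-ion": 0.72,
    }

    # Mass of each storage medium needed to hold the energy in 1 kg of petrol.
    for name, density in energy_density_mj_per_kg.items():
        kg_needed = 35 / density
        print(f"{name}: {density} MJ/kg, ~{kg_needed:.3g} kg to match 1 kg of petrol")

Roughly fifty kilograms of lithium-ion cells are needed to store the energy in a single kilogram of petrol, which is the gap the rest of this post is about.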
Metallic zinc could be used as an alternative fuel for vehicles, either in a zinc–air battery or to generate hydrogen near the point of use. Zinc's characteristics have motivated considerable interest as an energy source for electric vehicles. Gulf General Atomic demonstrated a 20 kW vehicle battery. General Motors conducted tests in the 1970s. Neither project led to a commercial product.
Meanwhile a Swiss company says it has developed rechargeable zinc-air batteries that can store three times the energy of lithium ion batteries, by volume, while costing only half as much. ReVolt, of Staefa, Switzerland, plans to sell small "button cell" batteries for hearing aids starting next year and to incorporate its technology into ever larger batteries, introducing cell-phone and electric bicycle batteries in the next few years. It is also starting to develop large-format batteries for electric vehicles.
The battery design is based on technology developed at SINTEF, a research institute in Trondheim, Norway. ReVolt was founded to bring it to market and so far has raised 24 million euros in investment. James McDougall, the company's CEO, says that the technology overcomes the main problem with zinc-air rechargeable batteries--that they typically stop working after relatively few charges. If the technology can be scaled up, zinc-air batteries could make electric vehicles more practical by lowering their costs and increasing their range.

Unlike conventional batteries, which contain all the reactants needed to generate electricity, zinc-air batteries rely on oxygen from the atmosphere to generate current.
The battery chemistry is also relatively safe because it doesn't require volatile materials, so zinc-air batteries are not prone to catching fire like lithium-ion batteries.
Because of these advantages, nonrechargeable zinc-air batteries have long been on the market. But making them rechargeable has been a challenge. Inside the battery, a porous "air" electrode draws in oxygen and, with the help of catalysts at the interface between the air and a water-based electrolyte, reduces it to form hydroxyl ions. These travel through an electrolyte to the zinc electrode, where the zinc is oxidized--a reaction that releases electrons to generate a current. For recharging, the process is reversed: zinc oxide is converted back to zinc and oxygen is released at the air electrode. But after repeated charge and discharge cycles, the air electrode can become deactivated, slowing or stopping the oxygen reactions. This can be due, for example, to the liquid electrolyte being gradually pulled too far into the pores, Henriksen says. The battery can also fail if it dries out or if zinc builds up unevenly, forming branch-like structures that create a short circuit between the electrodes.
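For reference, the textbook zinc-air half-reactions behind that description, written in the discharge direction (recharging runs them in reverse), are:

    % Zinc electrode (anode) during discharge:
    \mathrm{Zn} + 4\,\mathrm{OH}^- \rightarrow \mathrm{Zn(OH)_4^{2-}} + 2e^-,
    \qquad \mathrm{Zn(OH)_4^{2-}} \rightarrow \mathrm{ZnO} + \mathrm{H_2O} + 2\,\mathrm{OH}^-
    % Air electrode (cathode) during discharge:
    \mathrm{O_2} + 2\,\mathrm{H_2O} + 4e^- \rightarrow 4\,\mathrm{OH}^-
    % Overall cell reaction:
    2\,\mathrm{Zn} + \mathrm{O_2} \rightarrow 2\,\mathrm{ZnO}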
ReVolt says it has developed methods for controlling the shape of the zinc electrode (by using certain gelling and binding agents) and for managing the humidity within the cell. It has also tested a new air electrode that has a combination of carefully dispersed catalysts for improving the reduction of oxygen from the air during discharge and for boosting the production of oxygen during charging. Prototypes have operated well for over one hundred cycles, and the company’s first products are expected to be useful for a couple of hundred cycles. McDougall hopes to increase this to between 300 and 500 cycles, which will make them useful for mobile phones and electric bicycles.

For electric vehicles, ReVolt is developing a novel battery structure that resembles that of a fuel cell. Its first batteries use two flat electrodes, which are comparable in size. In the new batteries, one electrode will be a liquid–a zinc slurry. The air electrodes will be in the form of tubes. To generate electricity, the zinc slurry, which is stored in one compartment in the battery, is pumped through the tubes where it’s oxidized, forming zinc oxide and releasing electrons. The zinc oxide then accumulates in another compartment in the battery. During recharging, the zinc oxide flows back through the air electrode, where it releases the oxygen, forming zinc again.
In the company’s planned vehicle battery, the amount of zinc slurry can be much greater than the amount of material in the air electrode, increasing energy density. Indeed, the system would be like a fuel-cell system or a conventional engine, in that the zinc slurry would essentially act as a fuel, pumping through the air electrode like the hydrogen in a fuel cell or the gasoline in a combustion engine. The batteries could also last longer, from 2,000 to 10,000 cycles. And if one part fails, such as the air electrode, it could be replaced, eliminating the need to buy a whole new battery. Large zinc deposits in the US could also give the country a cash injection as an alternative to fossil fuels. Meanwhile, IBM has been working on a battery based on lithium-air technology.
The major appeal of the Li-air battery is the extremely high energy density, a measure of the amount of energy a battery can store for a given volume, which rivals that of traditional gasoline powered engines. Li-air batteries gain this advantage in energy density since they use oxygen from the air instead of storing an oxidizer internally.
The energy density of gasoline is approximately 13 kWh/kg, which corresponds to about 1.7 kWh/kg of energy delivered to the wheels after accounting for losses. The theoretical energy density of the lithium-air battery is 12 kWh/kg, excluding the oxygen mass. It has been theorized that a practical energy density of 1.7 kWh/kg at the wheels of an automobile could be realized when accounting for over-potentials, other cell components, battery-pack ancillaries, and the much higher efficiency of electric motors.
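As a rough check on how those figures relate, here is a short Python sketch; the efficiency numbers are illustrative assumptions (roughly 13 percent tank-to-wheel for a petrol drivetrain, which reproduces the 13 to 1.7 kWh/kg step above, and about 85 percent for an electric drivetrain), as is the practical lithium-air pack figure:

    # Tank-to-wheel comparison. All figures below are illustrative assumptions.
    gasoline_kwh_per_kg = 13.0
    gasoline_drivetrain_eff = 0.13      # assumed; matches the 1.7 kWh/kg figure above
    li_air_pack_kwh_per_kg = 2.0        # assumed practical pack-level specific energy
    electric_drivetrain_eff = 0.85      # assumed motor + electronics efficiency

    print(f"petrol at the wheels:  {gasoline_kwh_per_kg * gasoline_drivetrain_eff:.2f} kWh/kg")
    print(f"li-air at the wheels:  {li_air_pack_kwh_per_kg * electric_drivetrain_eff:.2f} kWh/kg")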

Stan Whittingham, professor of chemistry and materials science at the State University of New York, spoke of the possibility that the practical energy density of lithium-air may not live up to the comparison with petrol.
A more realistic estimate is the 800 watt hours per kilogram that startup PolyPlus is hoping to achieve. PolyPlus received a Department of Energy grant to work on encapsulating the lithium electrode to create a more stable battery system.
Whittingham said lithium-air batteries might find good uses elsewhere, just not in cars. He also called IBM "one of the culprits for hyping" lithium-air battery technology for electric cars. Even if IBM delivers on this technology, everyone agrees that commercializing lithium-air batteries will take a long time. IBM said its project won’t likely lead to a commercial product until 2020 or 2030.
The 10- or 20-year commercial timeline for the next generation of batteries (the lithium-air system) might be a slight disappointment, considering most cars will not last that long. While today's electric vehicles can reach high speeds and give hope of a clean future, lithium-ion doesn't promise a sustainable replacement for fossil fuel. The current age holds many wonders and ideas but still feels like a technological no man's land between the gadgets we want and the science we already know. The zinc battery is the closest alternative we have to lithium-ion, but again progress feels slow. The probable solution, I suspect, will lie in increasing surface area at the nano scale, allowing more electrochemical interactions between anode and cathode. The remaining problem is oxygenating the system to unlock the energy density; hopefully we might get a working prototype soonish...


Tuesday 27 November 2012

The Hypothetical Possibilities of Space Travel

Space drives, whether fictitious or theoretical, will play an important part in space exploration in the next couple of decades. Although governments seem to be tightening their budgets, the private sector will see it as a business opportunity to boldly go where no franchise has gone before. Establishing tourist rides seems fairly simple, considering aeroplane technology can provide some of the height; the rest of the journey is then provided by a hybrid rocket motor, as described in Virgin Galactic's travel literature.
The success of these travel excursions will hopefully filter down into research and development. Already Reaction Engines, a relatively new company, has plans to create a new type of rocket engine that provides a duality: atmospheric air-breathing flight like a normal jet engine, as well as the self-contained workings of a rocket engine. The secret of the engine is its cooling system, because any craft leaving the atmosphere needs to reach hypersonic speeds, and at those speeds the intake air reaches temperatures around 1,000 degrees Celsius, which is unsuitable for a normal jet engine's air supply. Instead, a special cooling system and extra plumbing are required to switch the engine from atmospheric breathing to a contained oxygen-hydrogen mix. The engine is still in the testing stage for its pre-cooler, and even given an unlimited budget the Skylon plane still won't be ready for another ten years.
The possibility of an orbit-capable engine still gives us hope that progress is continuing, although ten years is a long time to wait for cheap space travel.

The further inconveniences of zero gravity and no atmosphere will create more problems, as propulsion to distant celestial objects would require something more than chemical rockets. According to NASA physicist Harold White, by placing a spheroid object between two regions of space-time, one expanding and the other contracting, you could, as Alcubierre theorized, create a “warp bubble” that moves space-time around the object, effectively re-positioning it. In essence, you’d have the end result of faster-than-light travel without the object itself having to move (with respect to its local frame of reference) at light-speed or faster. The only catch: Alcubierre says that, “just as happens with wormholes,” you’d need “exotic matter” (matter with “strange properties”) to distort space-time, and the amount of energy necessary to power that would be on a par with the mass-energy of the planet Jupiter.
White, who just shared his latest ideas at the 100 Year Starship 2012 Public Symposium, says that if you adjust the shape of the ring surrounding the object, from something that looks like a flat halo into something thicker and curvier, you could power Alcubierre’s warp drive with a mass roughly the size of NASA’s Voyager 1 probe.
In other words: reduction in energy requirements from a planet with a mass equivalent to over 300 Earths, down to an object that weighs just under 1,600 pounds.
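To get a feel for the scale of that reduction, here is a quick E = mc² calculation in Python, using Jupiter's approximate mass and the roughly 1,600 lb figure converted to kilograms (round numbers only):

    # Mass-energy (E = m * c^2) of the two 'exotic matter' budgets mentioned above.
    c = 3.0e8                        # speed of light, m/s
    jupiter_mass_kg = 1.9e27         # approximate mass of Jupiter
    voyager_like_kg = 1600 * 0.4536  # ~1,600 lb converted to kilograms (~725 kg)

    for label, m in [("Jupiter-mass budget", jupiter_mass_kg),
                     ("Voyager-sized budget", voyager_like_kg)]:
        print(f"{label}: {m * c**2:.2e} joules")

That is a drop of roughly 24 orders of magnitude in the required mass-energy.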
What’s more, if you oscillate the space warp, White claims you could reduce the energy load even further. Theoretically, exotic matter provides a solution, but that would require something close to recreating the fictional warp engine, using a controlled matter-antimatter reaction in a dilithium crystal matrix to possibly make exotic matter, which is impossible for now.
Dr. White's team is trying to find proof of those loopholes. They have "initiated an interferometer test bed that will try to generate and detect a microscopic instance of a little warp bubble" using an instrument called the White-Juday Warp Field Interferometer.
Short of resorting to Star Trek physics, the alternative is a nuclear power source providing substantial electrical power for a plasma drive. The Variable Specific Impulse Magnetoplasma Rocket (VASIMR) is an electro-magnetic thruster for spacecraft propulsion. It uses radio waves to ionize and heat a propellant, and magnetic fields to accelerate the resulting plasma to generate thrust. It is one of several types of spacecraft electric propulsion systems.
The method of heating plasma used in VASIMR was originally developed as a result of research into nuclear fusion. VASIMR is intended to bridge the gap between high-thrust, low-specific impulse propulsion systems and low-thrust, high-specific impulse systems. VASIMR is capable of functioning in either mode. Costa Rican scientist and former astronaut Franklin Chang-Diaz created the VASIMR concept and has been working on its development since 1977.
Plasma provides decent enough propulsion for travelling to planets around the solar system; a trip to Mars would probably take five or six weeks instead of the better part of a year with conventional rockets.

Constructing a nuclear reactor on Earth costs roughly $14.5 billion, judging by the price tag of the Darlington Nuclear Generating Station, and the engineering and construction problems of a reactor would only become more difficult in zero gravity. A possible solution is a contained capsule similar to a thorium reactor, which could reduce the build cost to around one billion dollars. On the safety side, such a reactor would not overheat and cause a Fukushima-style incident in space, and the shielding for such a craft could be economically placed around the living quarters.
Alternatively, cold fusion, which some argue has been mistakenly disregarded, has resurfaced as a hypothetically cheap solution: the construction of a cold fusion cell.
Dr. Focardi has been publishing strong results with nickel-hydrogen fusion since 1994. A 1996 paper reported two cells that ran 300 days, producing 250 and 167 kilowatt-hours of excess heat. Andrea Rossi is an inventor and businessman who hired Dr. Focardi in 2007 as a consultant, and he has been financing the entire development with his own money. Rossi's design uses a nickel powder with catalysts instead of nickel sheets, and is therefore claimed to produce much more power. In 2010 they jointly published a paper that reported six different experiments with durations of up to 52 days; the longest experiment used 19 kWh of input energy to produce 3,768 kWh of output energy. That is still a long way from the 200,000 kW required for a full-scale plasma engine.
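For a sense of scale, here is a quick Python calculation of the average power implied by that longest run, taking the published figures at face value:

    # Average power implied by the reported 52-day run: 3768 kWh out, 19 kWh in.
    hours = 52 * 24
    avg_output_kw = 3768 / hours      # roughly 3 kW average output
    avg_input_kw = 19 / hours
    plasma_engine_kw = 200_000        # figure mentioned above for a full-scale plasma drive

    print(f"average output ~{avg_output_kw:.1f} kW, average input ~{avg_input_kw:.3f} kW")
    print(f"shortfall vs plasma engine: factor of ~{plasma_engine_kw / avg_output_kw:,.0f}")

Roughly three kilowatts of average output is impressive for a benchtop cell, if real, but it is still a factor of tens of thousands short of a plasma-engine power supply.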
Hot fusion is a distant possibility, with the slowly built International Thermonuclear Experimental Reactor not ready until 2020, so it is ruled out for the foreseeable future as a contender for space applications (my own view).
Instead, consider heat pipe technology, which was invented at Los Alamos in 1963. A team of NASA and Department of Energy researchers has shown that a reliable nuclear reactor can be built on technology developed several decades ago.
The experiment known as the Demonstration Using Flattop Fissions, or DUFF, is the first demonstration of a space nuclear reactor system to produce electricity in the United States since 1965, and the experiment confirms basic nuclear reactor physics and heat transfer for a simple, reliable space power system.
A heat pipe is a sealed tube with fluid inside that can efficiently transfer heat produced by a reactor with no moving parts. In the mid-1980s, Los Alamos developed a lithium heat pipe that transferred heat energy at a power density of 23 kilowatts per square centimeter—to understand the intensity of that amount of heat energy, consider that the heat emitted from the sun's surface is only 6 kilowatts per square centimeter. Lithium is placed inside a molybdenum pipe, which can operate at white-hot temperatures approaching 1,477 K (2,200°F). Once heated inside the pipe, the lithium vaporizes and carries heat down the pipe's length.
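That solar comparison can be sanity-checked with the Stefan-Boltzmann law; a quick Python sketch using the Sun's effective surface temperature of about 5,778 K:

    # Radiated power per unit area from the Sun's surface: sigma * T^4.
    sigma = 5.67e-8            # Stefan-Boltzmann constant, W / (m^2 K^4)
    t_sun = 5778.0             # effective surface temperature of the Sun, K

    flux_w_per_m2 = sigma * t_sun ** 4
    flux_kw_per_cm2 = flux_w_per_m2 / 1e4 / 1e3   # convert W/m^2 to kW/cm^2
    print(f"~{flux_kw_per_cm2:.1f} kW per square centimetre")  # roughly 6.3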
In 1996, the space shuttle Endeavour carried into space three Los Alamos heat-pipe prototypes. The designs of these liquid-metal prototypes were for use in advanced spacecraft. The pipes operated at temperatures in excess of 900°F and performed flawlessly in all tests. In 2000, Los Alamos worked with NASA's Marshall Space Flight Center to develop heat pipes to generate electricity and propulsion in spacecraft designed to journey to the solar system's outer limits. Because heat pipes work efficiently in zero-gravity environments, they are routinely used to cool electronic elements aboard geostationary communication satellites.
Propulsion other than warp drives can be achieved; the only drawbacks are money and resources. For practicable travel around our solar system, the plasma engine would require 200,000 kW, and short of the impracticable extension cord to a spacecraft, the only solution is a contained nuclear reactor generating 0.2 gigawatts of electrical power. Thorium reactors, though theoretically possible, are quite expensive at around a billion dollars and not really designed for zero gravity, while heat pipes and cold fusion may sound promising but still need to be scaled up and tested at plasma-engine power levels. Current technology or a forward-thinking drive may get us there sooner, but for now expect very slow progress for the next ten or so years, until some form of orbiting infrastructure has been established by space tourism. Although progress on space propulsion drives is slow, there are still new, innovative ideas that might emerge to help generate, store or transmit power. The future might have unforeseeable solutions; you just have to wait and see...


Monday 26 November 2012

Ultraviolet Light, a brief description


Ultraviolet (UV) light is electromagnetic radiation with a wavelength shorter than that of visible light, but longer than X-rays, that is, in the range 10 nm to 400 nm, corresponding to photon energies from 3 eV to 124 eV. It is so-named because the spectrum consists of electromagnetic waves with frequencies higher than those that humans identify as the colour violet. These frequencies are invisible to humans, but visible to a number of insects and birds.
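Those photon-energy figures follow directly from the wavelengths; a quick Python check using E = hc/λ, with hc ≈ 1240 eV·nm:

    # Photon energy E = h*c / wavelength, expressed in electronvolts.
    HC_EV_NM = 1239.84   # h*c in eV*nm

    for wavelength_nm in (400, 10):   # the two ends of the UV range
        print(f"{wavelength_nm} nm -> {HC_EV_NM / wavelength_nm:.1f} eV")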
UV light is found in sunlight (where it constitutes about 10% of the energy in vacuum) and is emitted by electric arcs and specialized lights such as black lights. It can cause chemical reactions, and causes many substances to glow or fluoresce. Most ultraviolet is classified as non-ionizing radiation.
The discovery of UV radiation was associated with the observation that silver salts darkened when exposed to sunlight. In 1801, the German physicist Johann Wilhelm Ritter made the hallmark observation that invisible rays just beyond the violet end of the visible spectrum darkened silver chloride-soaked paper more quickly than violet light itself. He called them "oxidizing rays" to emphasize chemical reactivity and to distinguish them from "heat rays," discovered the previous year at the other end of the visible spectrum. The simpler term "chemical rays" was adopted shortly thereafter, and it remained popular throughout the 19th century. The terms chemical and heat rays were eventually dropped in favour of ultraviolet and infrared radiation, respectively.
The higher energies of the ultraviolet spectrum from wavelengths about 10 nm to 120 nm ('extreme' ultraviolet) are ionizing, but this type of ultraviolet in sunlight is blocked by normal dioxygen in air, and does not reach the ground. However, the entire spectrum of ultraviolet radiation has some of the biological features of ionizing radiation, in doing far more damage to many molecules in biological systems than is accounted for by simple heating effects (an example is sunburn). These properties derive from the ultraviolet photon's power to alter chemical bonds in molecules, even without having enough energy to ionize atoms.

Although ultraviolet radiation is invisible to the human eye, most people are aware of the effects of UV on the skin, called suntan and sunburn. In addition to short wave UV blocked by oxygen, a great deal (97%) of mid-range ultraviolet (almost all UV above 280 nm and most above 315 nm) is blocked by the ozone layer, and like ionizing short wave UV, would cause much damage to living organisms if it penetrated the atmosphere.
After atmospheric filtering, only about 3% of the total energy of sunlight at the zenith is ultraviolet, and this fraction decreases at other sun angles. Much of it is near-ultraviolet that does not cause sunburn. An even smaller fraction of ultraviolet that reaches the ground is responsible for sunburn and also the formation of vitamin D (peak production occurring between 295 and 297 nm) in all organisms that make this vitamin (including humans). The UV spectrum thus has many effects, both beneficial and damaging, to human health.
The most common form of UV radiation is sunlight, which produces three main types of UV rays:
  • UVA
  • UVB
  • UVC
UVA rays have the longest wavelengths, followed by UVB, and UVC rays which have the shortest wavelengths. While UVA and UVB rays are transmitted through the atmosphere, all UVC and some UVB rays are absorbed by the Earth’s ozone layer. So, most of the UV rays you come in contact with are UVA with a small amount of UVB. Like all forms of light on the EM spectrum, UV radiation is classified by wavelength. Wavelength describes the distance between the peaks in a series of waves.
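Since the three bands are defined purely by wavelength, classification is trivial; here is a small Python helper using the conventional boundaries (UVC roughly 100-280 nm, UVB 280-315 nm, UVA 315-400 nm):

    # Classify a UV wavelength (in nanometres) into the conventional bands.
    def uv_band(wavelength_nm):
        if 100 <= wavelength_nm < 280:
            return "UVC"
        if 280 <= wavelength_nm < 315:
            return "UVB"
        if 315 <= wavelength_nm <= 400:
            return "UVA"
        return "not UV"

    for nm in (254, 300, 365):   # germicidal lamp line, mid-UVB, black light
        print(nm, "nm ->", uv_band(nm))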
The longer-wave UVA and UVB can cause molecules to vibrate and rotate, resulting in heating, while the shorter-wave UVC (used in UV sterilization) will ionize many atoms and molecules, compared with the even shorter-wave gamma rays, which will ionize most atoms. Ultraviolet sterilization is an effective tool for disease prevention in aquariums and ponds and for general water quality control, such as the control of green ponds or cloudy aquariums. UVC sterilization is also useful in home, office and hospital air purification (even UVC/redox blood therapy).
There is a lot of new evidence as to the benefits of UV sterilization for ALL fish, and many myths have been dispelled such as “UV Sterilizers destroying beneficial nitrifying bacteria”.
One of the most powerful methods of sterilizing any medical instrument that may penetrate the skin is the use of ultraviolet light. Ultraviolet light used for sterilizing medical equipment is composed of light frequencies in the C band of the ultraviolet spectrum, otherwise known as UV-C. Sterilizing equipment is simple, but all surfaces must be exposed to the UV-C light for a specific amount of time. UV disinfection is a photochemical process: the microorganisms that pollute the indoor environment are DNA- or RNA-based, and their cell membranes and DNA break down when exposed to high-intensity UV at 253.7 nm. This bond breakage translates into cellular membrane damage, in which case the cells die, or DNA/RNA damage, which renders the germs harmless because they can no longer reproduce and cause disease.
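The "specific amount of time" comes down to delivered dose, which is simply irradiance multiplied by exposure time. Here is a minimal Python sketch, with the target dose and lamp irradiance chosen purely as illustrative assumptions:

    # UV dose = irradiance * time. Figures below are illustrative assumptions.
    def exposure_seconds(target_dose_mj_per_cm2, irradiance_mw_per_cm2):
        # mJ/cm^2 divided by mW/cm^2 gives seconds.
        return target_dose_mj_per_cm2 / irradiance_mw_per_cm2

    target_dose = 40.0     # mJ/cm^2, an assumed disinfection target
    irradiance = 0.5       # mW/cm^2 at the surface, an assumed lamp figure
    print(f"~{exposure_seconds(target_dose, irradiance):.0f} seconds of exposure needed")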

Cosmically speaking, very hot objects emit some amount of UV radiation; the hotter the object, the more UV is emitted. Observing and recording the UV from astronomical objects such as planets in our solar system, stars, nebulae and galaxies enables us to gain extra information such as the temperature and chemical composition of these objects. The only problem is that our Earth’s ozone layer absorbs much of the UV, so these observations need to be made outside the Earth’s atmosphere. On board the Hubble Space Telescope, the Faint Object Spectrograph (FOS) and the Goddard High Resolution Spectrograph (GHRS) are used to collect and analyse UV light from interesting targets.
Astrochemistry is the study of the abundance and reactions of chemical elements and molecules in the universe, and their interaction with radiation. One particularly important experimental tool in astrochemistry is spectroscopy, the use of telescopes to measure the absorption and emission of UV light from molecules and atoms in various environments.
Within our own solar system, hydrogen is easily detected in the ultraviolet (UV) and visible ranges from its absorption and emission of light. This type of detection allows us to seek out sources of water, or even other chemicals that could indicate the ingredients of life.
UV light may be taken for granted and has become synonymous with tanning and solar activity, but thanks to science it has also become an important tool in sterilization and astrochemistry. Perhaps with further study we can examine its effects on DNA mutation and chemical interaction at the nano scale. The future of ultraviolet light can be so much more than disco lights and tanning booths; hopefully it won't be taken for granted any more...


Sunday 25 November 2012

Snipers, a brief description


A sniper is a highly trained marksman who operates alone or in a pair, maintains close visual contact with the enemy and engages targets from concealed positions or from physical distances exceeding the detection capabilities of enemy personnel, without being detected. These sniper teams operate independently, with little combat asset support from their parent units. Snipers typically have highly selective and specialized training and use high-precision rifles and optics, and often have sophisticated communication assets to feed valuable combat information back to their units.
In addition to marksmanship, military snipers are trained in camouflage, field craft, infiltration, reconnaissance and observation. Snipers are especially effective when deployed within the terrain of urban warfare, or jungle warfare.
A sniper's primary function in warfare is to provide detailed reconnaissance from a concealed position and, if necessary, to reduce the enemy's fighting ability by striking at high-value targets (especially officers).

During World War I, snipers appeared as deadly sharpshooters in the trenches. At the start of the war, only Imperial Germany had troops that were issued scoped sniper rifles.
Although sharpshooters existed on all sides, the Germans specially equipped some of their soldiers with scoped rifles that could pick off enemy soldiers showing their heads out of their trench. At first the French and British believed such hits to be coincidental hits, until the German scoped rifles were discovered. During World War I, the Germans received a reputation for the deadliness and efficiency of their snipers, partly because of the high-quality lenses the Germans could manufacture.
One of the best known battles involving snipers, and the battle that made the Germans reinstate their specialized sniper training, was the Battle of Stalingrad. Their defensive position inside a city filled with rubble meant that Soviet snipers were able to inflict significant casualties on the Wehrmacht troops. Because of the nature of fighting in city rubble, snipers were very hard to spot and seriously dented the morale of the German attackers. The best known of these snipers was probably Vassili Zaitsev, immortalized in the novel War of the Rats and the subsequent film Enemy At The Gates.

In the United States Armed Forces, sniper training was only very elementary and focused on being able to hit targets over long distances. Snipers were required to be able to hit a body over 400 meters away, and a head over 200 meters away. There was almost no concern with the ability to blend into the environment. Sniper training varied from place to place, resulting in a wide range of qualities of snipers.
Military sniper training aims to teach a high degree of proficiency in camouflage and concealment, stalking, observation and map reading as well as precision marksmanship under various operational conditions. Trainees typically shoot thousands of rounds over a number of weeks, while learning these core skills. Snipers are trained to squeeze the trigger straight back with the ball of their finger, to avoid jerking the gun sideways.
 The most accurate position is prone, with a sandbag supporting the stock, and the stock's cheek-piece against the cheek. In the field, a bipod can be used instead. Sometimes a sling is wrapped around the weak arm (or both) to reduce stock movement. Some doctrines train a sniper to breathe deeply before shooting, then hold their lungs empty while they line up and take their shot. Some go further, teaching their snipers to shoot between heartbeats to minimize barrel motion.
The key to sniping is accuracy, which applies to both the weapon and the shooter. The weapon should be able to consistently place shots within tight tolerances. The sniper in turn must utilize the weapon to accurately place shots under varying conditions. A sniper must have the ability to accurately estimate the various factors that influence a bullet's trajectory and point of impact such as: range to the target, wind direction, wind velocity, altitude and elevation of the sniper and the target and ambient temperature. Mistakes in estimation compound over distance and can decrease lethality or cause a shot to miss completely.
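To see how a range-estimation error compounds with distance, here is a deliberately simplified Python sketch, a no-drag, flat-fire model with made-up but plausible numbers; real ballistic solvers also account for drag, wind, altitude and temperature:

    # Simplified (no air drag) bullet drop: drop = 0.5 * g * t^2, with t = range / velocity.
    # Numbers are illustrative, not data for any particular rifle.
    G = 9.81                 # m/s^2
    muzzle_velocity = 850.0  # m/s, assumed

    def drop_m(range_m):
        t = range_m / muzzle_velocity
        return 0.5 * G * t ** 2

    true_range = 800.0
    estimated_range = 840.0   # a 5% range misjudgement
    miss = drop_m(estimated_range) - drop_m(true_range)
    print(f"drop at {true_range:.0f} m: {drop_m(true_range):.2f} m, "
          f"aiming error from misjudged range: {miss:.2f} m")

Misjudging an 800 m target by just 5 percent shifts the point of impact by almost half a metre in this toy model, and with real air drag the penalty is larger still.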

Multiple variables, such as wind speed and direction, human error due to jitter, recoil, range miscalculation and even the curvature of the earth, diminish shot accuracy. Although the longest recorded shot required skill, there was a certain amount of luck in judging when to shoot as the barrel jittered; a mile-and-a-half kill could not be achieved by a rank amateur, considering that most of the training involves techniques for reducing movement. Recently, however, the first precision-guided firearm (PGF) system, developed by TrackingPoint™, has begun putting fighter-jet-style “lock and launch” technology into a rifle system to create what the company calls the most accurate shooting system in the world, one that can enable anyone to be an elite long-range marksman in minutes and greatly increase one's ability to take multiple shots at multiple ranges quickly and effectively.
TrackingPoint’s exclusive XactSystem™ is a PGF system that includes a custom rifle, precision conventional ammunition, a networked tracking scope with heads up display, and a guided trigger. TrackingPoint’s precision guided firearm virtually eliminates shooter errors in aim, trigger pull, environmental inputs and range miscalculation to deliver five-times the first shot success rate of traditional systems at targets up to 1,200 yards, regardless of shooter skill level or expertise.
With the XactSystem, shooters can tag, track and hit their aim point exactly – a shot process referred to as Tag, Track, Xact™ or TTX – because the PGF automatically adjusts for range, temperature, barometric pressure, spin drift, wind input, cant, inclination and more.
Simply paint the target with the tag to lock on, watch as the tag persists regardless of relative movement, align the reticle representing the firing solution with the tag, squeeze and hold the trigger to arm the system, and the networked tracking scope releases the guided trigger when the reticle and tag are optimally aligned. Having removed the guesswork from firing at a target, only exterior or environmental factors remain to cause inaccuracy: sudden wind changes and movement of the target are unforeseeable variables that can still make a precision-guided firearm miss.
Meanwhile Engineering researchers at Sandia National Laboratories have developed a new "self guided" bullet capable of steering itself like a tiny, silent guided missile towards its target. The task of creating the new "smart" bullet was a daunting one, and has occupied the researchers for three years now. The inspiration for the project came from the stalled state of firearms development for the U.S. armed forces.
The resulting 4-inch long, half-an-inch in diameter 0.50 caliber round takes the input of its microoptical sensor and feeds it into an 8-bit CPU, housed inside the shell. The CPU runs correction algorithms and outputs corrections 30 times a second, in the form of signals to electromagnetic actuators attached to small fins on the bullet.
It has a range of 2 km with an accuracy of 8 inches and works on the principle of a laser-painted target, with the electronics at the front steering and correcting its trajectory if the wind changes or the target moves slightly. With two organisations, Sandia National Laboratories and DARPA, quietly conducting tests, the early results from Sandia have been successful, with the .50-calibre bullet reaching speeds of Mach 2.1.
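As a toy illustration of the idea, not Sandia's actual algorithm, the guidance loop amounts to measuring the lateral offset to the laser spot and nudging the fins to close it, roughly thirty times a second; the gain and correction-rate values below are made up for the sketch:

    # Toy proportional guidance loop running at 30 corrections per second.
    # Purely illustrative; this is not the Sandia design or algorithm.
    UPDATE_HZ = 30
    DT = 1.0 / UPDATE_HZ
    GAIN = 4.0        # proportional gain (made-up value)
    MAX_RATE = 2.0    # maximum lateral correction rate in m/s (made-up value)

    def steer_toward_laser_spot(offset_m, flight_seconds=1.0):
        """Drive the lateral offset between bullet and laser spot toward zero."""
        for _ in range(int(flight_seconds * UPDATE_HZ)):
            command = max(-MAX_RATE, min(MAX_RATE, -GAIN * offset_m))
            offset_m += command * DT   # fins nudge the bullet sideways each update
        return offset_m

    print(f"residual offset after 1 s of flight: {steer_toward_laser_spot(0.5):.3f} m")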
A laser-painted target in this case eliminates external variables and helps ensure accuracy. Bullet-aiming technology and smart projectiles enable anyone with a trigger finger to operate the weapon and destroy a target, so skills slowly acquired in mastering marksmanship become almost a waste of time when such technology is available. The future of war may be won not by strategy or by numbers but by innovation and problem solving. The current battlefront may not be an obvious area with identifiable targets, but I am sure science will find a way to discriminate between hostiles and ordinary civilians...