Monday 31 December 2012

The News, a brief history



Before the invention of newspapers in the early 17th century, official government bulletins and edicts were circulated at times in some centralized empires. The first documented use of an organized courier service for the diffusion of written documents is in Egypt, where pharaohs used couriers to spread their decrees across the territory of the state (around 2400 BC). This practice almost certainly has roots in the much older practice of oral messaging and may have been built on a pre-existing infrastructure.
In Ancient Rome, Acta Diurna, or government announcement bulletins, were made public by Julius Caesar. They were handwritten news sheets posted by the government in the public marketplace from the year 59 BC to at least 222 AD. The Acta Diurna announced news of politics, trials, scandals, military campaigns and executions.
In China, early government-produced news sheets, called tipao, circulated among court officials during the late Han dynasty (202 BC – 220 AD). Between 713 and 734, the Kaiyuan Za Bao ("Bulletin of the Court") of the Chinese Tang dynasty published government news; it was handwritten on silk and read by government officials. At some point during the Tang dynasty (618–907), the Chinese used carved wooden blocks to print tipao, arguably making them the first printed newspapers in history.

The first reference to privately published news sheets in Beijing dates to 1582, during the late Ming dynasty. In early modern Europe, increased cross-border interaction created a rising need for information, which was met by concise handwritten news sheets. In 1556, the government of Venice first published the monthly Notizie scritte, which cost one gazetta. These avvisi were handwritten newsletters used to convey political, military, and economic news quickly and efficiently among Italian cities (1500–1700); they shared some characteristics of newspapers, though they are usually not considered true newspapers. Due to low literacy rates, news was at times disseminated by town criers.
Relation aller Fürnemmen und gedenckwürdigen Historien, from 1605, is recognized as the world's first newspaper. The oldest news agency is Agence France-Presse (AFP), founded in 1835 by a Parisian translator and advertising agent, Charles-Louis Havas, as Agence Havas. In modern times, printed news had to be phoned in to a newsroom or brought there by a reporter, where it was typed and either transmitted over wire services or edited and manually set in type along with other news stories for a specific edition. Today, the term "breaking news" has become trite, as commercial broadcasters and United States cable news services available 24 hours a day use live satellite technology to bring current events into consumers' homes as they occur.
Events that used to take hours or days to become common knowledge in towns or in nations are fed instantaneously to consumers via radio, television, mobile phone, and the Internet.
News serves the dual purpose of reporting on current events and recording the historical details of past events. While technology has moved on and not all possible news can be digested, certain edits or types of news are placed in the foreground compared with others. This type of distortion, in which entertainment is held in high regard, can suggest propaganda.

News organizations are often expected to aim for objectivity; reporters claim to try to cover all sides of an issue without bias, as compared to commentators or analysts, who provide opinion or a personal point of view. The result is a laying out of facts in a sterile, noncommittal manner, then standing back to "let the reader decide" which view is true. Several governments impose certain constraints on news organizations, or police them for bias. In the United Kingdom, for example, limits are set by the government agency Ofcom, the Office of Communications. Both newspapers and broadcast news programs in the United States are generally expected to remain neutral and avoid bias, except for clearly indicated editorial articles or segments. Many single-party governments have operated state-run news organizations, which may present the government's views.
As with any propaganda, news propaganda may be spread for widely different reasons including governance, political or ideological motivations, partisan agendas, religious or ethnic reasons, and commercial or business motivations; their purposes are not always clear. News propaganda also can be motivated by national security reasons, especially in times of war or domestic upheaval.
During the 2010 financial crisis in Greece, the media openly played a protective role towards the government. The news programme of Mega Channel in particular has been criticised by many other media outlets, as well as political parties, for taking part in government propaganda in favour of the International Monetary Fund. Perhaps today's news has taken a turn for the worse, as scandals of phone hacking and intrusive paparazzi sensationalise celebrities. A healthy awareness of the world comes with a lot of negative news stories. Small wonder most people would rather bury themselves in good news...


Sunday 30 December 2012

Attention and distraction, the tools for popularity


In cognitive psychology there are at least two models which describe how visual attention operates. These models may be considered loosely as metaphors which are used to describe internal processes and to generate hypotheses that are falsifiable. Generally speaking, visual attention is thought to operate as a two-stage process. In the first stage, attention is distributed uniformly over the external visual scene and processing of information is performed in parallel.
In the second stage, attention is concentrated on a specific area of the visual scene (i.e. it is focused), and processing is performed in a serial fashion. The first of these models to appear in the literature is the spotlight model. The term "spotlight" was inspired by the work of William James, who described attention as having a focus, a margin, and a fringe. The focus is an area that extracts information from the visual scene at high resolution; the geometric center of this area is where visual attention is directed. Surrounding the focus is the fringe of attention, which extracts information in a much cruder fashion (i.e. at low resolution).
This fringe extends out to a specified area, and this cut-off is called the margin. The second model, called the zoom-lens model, was first introduced in 1983. This model inherits all the properties of the spotlight model (i.e. the focus, the fringe, and the margin) but has the added property of changing in size. This size-change mechanism was inspired by the zoom lens you might find on a camera, and any change in size can be described by a trade-off in the efficiency of processing. In understanding the principles of attention it is equally important to understand distraction and how we can use it to our advantage.

Distraction is the diverting of the attention of an individual or group from the chosen object of attention onto the source of distraction. Distraction is caused by a lack of ability to pay attention; a lack of interest in the object of attention; or the greater intensity, novelty or attractiveness of something other than the object of attention. Distractions come from both external and internal sources.
Multitasking could also be considered as distraction in situations requiring full attention on a single object (e.g. sports, academic tests, performance). The issue of distraction in the workplace is studied in interruption science. According to Gloria Mark, a leader in interruption science, the average knowledge worker switches tasks every three minutes, and, once distracted, a worker takes nearly a half-hour to resume the original task.
Alternatively, propagandistic techniques of distraction are used in media manipulation. The idea is to encourage the public to focus on a topic or idea that the compliance professional feels is supportive of their cause. By focusing attention, a particular ideology can be made to seem the only reasonable choice.
Magicians use distraction techniques to draw the audience's attention away from whichever hand is engaged in sleight of hand. Magicians can accomplish this by encouraging the audience to look elsewhere or by having an assistant do or say something to draw the audience's attention away. Sleight of hand is often used in close-up magic, performed with the audience close to the magician, usually within three or four meters, possibly in physical contact. It often makes use of everyday items as props, such as cards and coins.
In media, in retail, and in other forms of social interaction, the tools of attention and distraction, used to emphasise or desensitise a piece of information, can create false interest. Beyond the basic need to survive and live, the constant bombardment of attention and distraction signals often motivates us to follow those avenues. When fully immersed in a social event, people will likely follow the crowd and yet remain individual. On any occasion it is more educational to spot the objects and people that attract the most attention and leave them for last...



Saturday 29 December 2012

Peer to peer loans, a brief description of bypassing the banks


Many economists have offered theories about how financial crises develop and how they could be prevented. There is little consensus, however, and financial crises are still a regular occurrence around the world.
When a bank suffers a sudden rush of withdrawals by depositors, this is called a bank run. Since banks lend out most of the cash they receive in deposits (see fractional-reserve banking), it is difficult for them to quickly pay back all deposits if these are suddenly demanded, so a run may leave the bank in bankruptcy, causing many depositors to lose their savings unless they are covered by deposit insurance. A situation in which bank runs are widespread is called a systemic banking crisis or just a banking panic.
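The fractional-reserve arithmetic behind a bank run can be sketched in a few lines of Python. All the figures here (deposit total, reserve ratio, withdrawal demands) are hypothetical, chosen only to illustrate the mechanism:

```python
# Toy fractional-reserve model: the bank keeps only a fraction of
# deposits as cash and lends the rest out, so it can cover everyday
# withdrawals but not a sudden run. All figures are hypothetical.
def can_meet_withdrawals(total_deposits, reserve_ratio, withdrawal_demand):
    reserves = total_deposits * reserve_ratio
    return withdrawal_demand <= reserves

deposits = 1_000_000      # total customer deposits
reserve_ratio = 0.10      # 10% held as cash, 90% lent out

# A normal day: 3% of deposits withdrawn -- easily covered.
print(can_meet_withdrawals(deposits, reserve_ratio, 30_000))   # True

# A run: a quarter of deposits demanded at once -- the bank fails.
print(can_meet_withdrawals(deposits, reserve_ratio, 250_000))  # False
```

The same lending that makes banking profitable is what makes a simultaneous rush of withdrawals impossible to honour.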
Economists say that a financial asset (a stock, for example) exhibits a bubble when its price exceeds the present value of the future income (such as interest or dividends) that would be received by owning it to maturity. If most market participants buy the asset primarily in hopes of selling it later at a higher price, instead of buying it for the income it will generate, this could be evidence that a bubble is present. If there is a bubble, there is also a risk of a crash in asset prices. Well-known examples of bubbles and crashes in stock and other asset prices include the Wall Street Crash of 1929, the crash of the dot-com bubble in 2000–2001, and the now-deflating United States housing bubble.
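The bubble test described above, comparing price with the present value of future income, can be put into a small Python sketch. The dividend stream, discount rate, and market price are all made up for illustration:

```python
# Present value of a stream of future dividends at discount rate r.
def present_value(dividends, r):
    return sum(d / (1 + r) ** t for t, d in enumerate(dividends, start=1))

# A hypothetical asset paying $5 a year for 20 years, discounted at 6%.
pv = present_value([5.0] * 20, 0.06)
print(round(pv, 2))       # 57.35 -- the income-based "fair" value

# A market price far above that value suggests buyers are counting on
# resale at a higher price rather than on the income -- a bubble sign.
price = 120.0
print(price > pv)         # True
```

The gap between price and present value is not proof of a bubble, but it is exactly the kind of evidence economists point to.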
An especially prolonged recession may be called a depression, while a long period of slow but not necessarily negative growth is sometimes called economic stagnation. Since these phenomena affect much more than the financial system, they are not usually considered financial crises as such, though there are links between the two.
A quarterly survey of commercial banks by the European Central Bank showed a surge in the number of institutions that were becoming more restrictive about who they lent to, because the banks themselves were having trouble raising money and were under pressure from regulators to reduce risk.
In the last couple of years, the lending industry has gone through an evolution and given way to social lending, also known as peer-to-peer or person-to-person (P2P) lending, as a new and promising mode of lending. The main objective of social lending hubs is to offer an online loan with the best interest rates and to make customers feel like they are borrowing from a friend or community. This peer-to-peer borrowing is increasingly being seen in a new light and considered part of community borrowing (which was more traditionally offered by small local community banks).
Peer-to-peer lenders offer a narrower range of services than traditional banks, and in some jurisdictions may not be required to have a banking license. Peer-to-peer loans are funded by investors who can choose the loans they fund; sometimes as many as several hundred investors fund one loan. Banks, on the other hand, fund loans with money from multiple depositors or money that they have borrowed from other sources; the depositors are not able to choose which loans to fund. Because of these differences, peer-to-peer lenders are considered non-banking financial companies.

One of the main advantages of person-to-person lending for borrowers has been better rates than traditional banks can offer (often below 10%). The advantage for lenders is higher returns than are obtainable from a savings account or other investments. Both of these benefits are the result of disintermediation, since peer-to-peer lenders avoid the costs of physical branches, capital reserves, and the high overheads borne by traditional financial institutions with many employees and costly locations.
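As a rough illustration of this disintermediation, here is a hypothetical Python sketch. Every figure in it (loan size, number of investors, all rates and fees) is assumed for the example, not drawn from any real platform:

```python
# A hypothetical P2P loan: many small investors jointly fund one loan,
# and the platform's thin overheads let the agreed rate sit between
# what a bank would charge and what a savings account would pay.
loan_amount = 9_000
investors = [250] * 36            # 36 investors lending $250 each
assert sum(investors) == loan_amount

bank_loan_rate = 0.14   # assumed bank rate for this borrower
savings_rate   = 0.02   # assumed savings-account rate
p2p_rate       = 0.08   # assumed rate agreed on the platform
platform_fee   = 0.01   # assumed platform cut of the interest

# First-year benefit of cutting out the bank, for each side:
borrower_saves = loan_amount * (bank_loan_rate - p2p_rate)
lender_gains   = loan_amount * ((p2p_rate - platform_fee) - savings_rate)
print(round(borrower_saves), round(lender_gains))   # 540 450
```

Both sides gain only because the spread that would normally pay for branches and overheads is split between them instead.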
Peer-to-peer lending also attracts borrowers who, because of their credit status or the lack thereof, are unqualified for traditional bank loans. Because past behavior is frequently indicative of future performance and low credit scores correlate with a high likelihood of default, peer-to-peer intermediaries have started to decline a large number of applicants and charge higher interest rates to the riskier borrowers who are approved. Some broker companies are also instituting funds into which each borrower makes a contribution and from which lenders are recompensed if a borrower is unable to pay back the loan.
Because, unlike depositors in banks, peer-to-peer lenders can choose whether to lend their money to safer borrowers with lower interest rates or to riskier borrowers with higher returns, peer-to-peer lending is treated legally as investment, and repayment in the case of borrower default is not guaranteed by the government. Despite a certain risk for lenders, peer-to-peer loans have been a growing trend since Zopa launched in 2005. Avoiding the bad reputation of loans and loan sharks, the peer-to-peer business model rests on the simple idea of joining a borrower with a lending community. Without the securities of a central bank, it also avoids the problems which the banks seem to have caused. Other concepts, like crowdfunding, allow fans and well-meaning donors to fund small start-up companies and create an industry. Financial institutions do not appear to invest for the future unless there are certain guarantees. As the public becomes discontented with financial scandals and continuing global money worries, it is likely that they will turn to peer-to-peer investments and loans. The future of creativity and innovation might rely on it, as more and more people are turning to this type of banking. Time will tell if this is a sound alternative to savings and loans or if indeed it is another bubble...

Friday 28 December 2012

Quantum radar, a brief description of an anti-anti-radar countermeasure


Quantum radar is a hypothetical remote-sensing method based on quantum entanglement. One possible implementation of such technology has been developed and patented by defense contractor Lockheed Martin. It intends to create a radar system which provides a better resolution and higher detail than classical radar can provide.

Radar and lidar (Light Detection And Ranging) systems bounce radio or light signals off an object and measure how long they take to return. The returning information can be used to determine the object's position and height, or to calculate its speed.
Meanwhile, Mehul Malik and colleagues at the University of Rochester, New York, applied a form of encryption from quantum cryptography: each outgoing photon is polarised in one of two ways according to a secret sequence.
Their radar system measures the polarisations of the returning photons, using conventional algorithms to determine speed and direction, but it also checks the encrypted polarisation sequence for proof that the signal is the original one, forcing anyone wishing to create a false beam to polarise their photons in the same sequence. But if that person tries to measure the photons arriving from the radar transmitter, quantum mechanics ensures that many of the true polarisations get lost.
So a false signal always has more wrongly polarised photons than the true beam, which would flag it as an enemy signal and make it easily distinguishable. In a lab test, photons reflected from a model of a stealth bomber had an error rate of less than 1 per cent. When the team tried to spoof the photons as if they were reflected off a bird, over half had the wrong polarisation. The same principle can also be used to reveal if photons encoding a secret key have been intercepted.
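The polarisation-checking idea can be illustrated with a toy simulation in the style of quantum key distribution. This is a simplified model, not the Rochester team's actual protocol: each photon is prepared in one of two bases, a would-be spoofer must guess the basis before measuring and re-sending, and wrong guesses randomise the result, producing a tell-tale error rate of roughly 25 per cent:

```python
import random
random.seed(1)

BASES = ('+', 'x')   # two possible polarisation bases per photon

def measure(bit, prep_basis, meas_basis):
    # A matching basis recovers the true bit; a mismatched basis
    # yields a random result (quantum measurement disturbance).
    return bit if prep_basis == meas_basis else random.randint(0, 1)

n = 10_000
sent = [(random.randint(0, 1), random.choice(BASES)) for _ in range(n)]

# Honest return: the receiver knows the secret basis sequence.
honest = [measure(bit, basis, basis) for bit, basis in sent]

# Spoofer: must guess each basis before measuring and re-sending.
spoofed = [measure(bit, basis, random.choice(BASES)) for bit, basis in sent]

err_honest = sum(h != bit for h, (bit, _) in zip(honest, sent)) / n
err_spoofed = sum(s != bit for s, (bit, _) in zip(spoofed, sent)) / n
print(err_honest)    # 0.0 -- the true beam checks out perfectly
print(err_spoofed)   # close to 0.25 -- the spoofed beam gives itself away
```

The spoofer guesses the wrong basis half the time, and half of those measurements come out wrong, so roughly a quarter of the re-sent photons betray the forgery; the full measure-and-resend chain gives the same rate.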


For example, if a stealth aircraft attempts to intercept these photons and resend them in a way that disguises its position, it would inevitably change the photons' quantum properties, revealing the interference. In order to jam such an imaging system, the object must disturb the delicate quantum state of the imaging photons, thus introducing statistical errors that reveal its activity. Traditional methods of evasion include the use of chaff to generate extra noise as a radar countermeasure.
In early 1942, a Telecommunications Research Establishment (TRE) researcher named Joan Curran investigated the idea and came up with a scheme for dumping packets of aluminium strips from aircraft to generate a cloud of false echoes. An early idea was to use sheets the size of a notebook page; these would be printed so they would also serve as propaganda leaflets. However, it was found the most effective version used strips of black paper backed with aluminium foil, exactly 27 by 2 centimetres (11 in × 0.79 in) and packed into bundles each weighing 1 pound (0.45 kg). The head of the TRE, A. P. Rowe, code-named the device "Window".

Meanwhile in Germany, similar research had led to the development of Düppel. The systems were all essentially identical in concept: small aluminium strips (or wires) cut to one-half of the target radar's wavelength. When hit by the radar, such lengths of metal resonate and re-radiate the signal. Opposing defences would find it almost impossible to distinguish the aircraft from the echoes caused by the chaff. Alternatively, electronic countermeasures work by jamming: transmitting signals on the radar frequency to produce a noise level sufficient to hide echoes.
The jammer's continuous transmissions will provide a clear direction to the enemy radar, but no range information. Deception may use a transponder to mimic the radar echo with a delay to indicate incorrect range. Transponders may alternatively increase return echo strength to make a small decoy appear to be a larger target. Target modifications include radar absorbing coatings and modifications of the surface shape to either "stealth" a high-value target or enhance reflections from a decoy.

The new system effectively renders all existing forms of radar countermeasure useless. Even passive radar, which subtly exploits terrestrial signals from public transmissions to conceal the location of possible transmitters, cannot match it: quantum radar will accurately record height, direction and speed, though at the cost of revealing that radar is in use. Although enemy aircraft will be able to detect the radar, it is unlikely that they can mount any countermeasures against quantum-encrypted radar scanning. This also makes the current fleet of stealth aircraft useless, which in retrospect seems like a regrettable avenue of research, considering the programme for the Northrop Grumman B-2 Spirit bomber cost US$44.75 billion (through 2004).

Thursday 27 December 2012

Artificial trees, the natural alternative



Photosynthesis is a process used by plants and other organisms to convert the light energy captured from the sun into chemical energy that can be used to fuel the organism's activities. Photosynthesis occurs in plants, algae, and many species of bacteria, but not in archaea.

The first photosynthetic organisms probably evolved about 3,500 million years ago, early in the evolutionary history of life, when all forms of life on Earth were microorganisms and the atmosphere had much more carbon dioxide. They most likely used hydrogen or hydrogen sulfide as sources of electrons, rather than water. Cyanobacteria appeared later, around 3,000 million years ago, and drastically changed the Earth when they began to oxygenate the atmosphere, beginning about 2,400 million years ago. This new atmosphere allowed the evolution of complex life such as protists. Eventually, no later than a billion years ago, one of these protists formed a symbiotic relationship with a cyanobacterium, producing the ancestor of many plants and algae. The chloroplasts in modern plants are the descendants of these ancient symbiotic cyanobacteria.
Photosynthesis happens when water is absorbed by the roots of green plants and is carried to the leaves by the xylem, and carbon dioxide is obtained from air that enters the leaves through the stomata and diffuses to the cells containing chlorophyll. In the light reactions, one molecule of the pigment chlorophyll absorbs one photon and loses one electron. This electron is passed to a modified form of chlorophyll called pheophytin, which passes the electron to a quinone molecule, allowing the start of a flow of electrons down an electron transport chain that leads to the ultimate reduction of NADP to NADPH. In addition, this creates a proton gradient across the chloroplast membrane; its dissipation is used by ATP synthase for the concomitant synthesis of ATP. The chlorophyll molecule regains the lost electron from a water molecule through a process called photolysis, which releases a dioxygen (O2) molecule.

In the light-independent (or "dark") reactions, the enzyme RuBisCO captures CO2 from the atmosphere and, in a process called the Calvin-Benson cycle that requires the newly formed NADPH, releases three-carbon sugars, which are later combined to form sucrose and starch. Carbon fixation produces an intermediate product, which is then converted to the final carbohydrate products. The carbon skeletons produced by photosynthesis are then variously used to form other organic compounds, such as the building material cellulose, as precursors for lipid and amino acid biosynthesis, or as a fuel in cellular respiration. The latter occurs not only in plants but also in animals when the energy from plants gets passed through a food chain.


While nature has provided essential oxygen and food from plants, man has attempted to utilize the sun for energy production. Solar cells have been able to generate clean power: each photon striking the surface of a solar array frees an electron, and this photoelectric effect long seemed like the only principle by which the sun could be harnessed artificially.
Professor Ray Frost, from Queensland University of Technology's School of Physical and Chemical Sciences, said the idea of geosequestration was to trap carbon dioxide and to lock it into minerals deep underground. Geosequestration of carbon dioxide is one of the methods under debate to reduce greenhouse gases and their effects on climate change.
The chemical process of using calcium or sodium hydroxide to absorb CO2 has been known for years, but the question of whether it can be done in an affordable, energy-efficient manner has not yet been fully answered. Constructing and erecting the collector device is only 20% of the cost; the remainder involves prying the CO2 loose from the absorbent and storing it, an energy-intensive process. A back-of-the-envelope calculation of total cost suggests $80 to $100 per ton of carbon captured, which is large compared to the $25–$75 per ton cost that proponents of a carbon tax or cap-and-trade scheme believe would stabilize atmospheric emissions of CO2.
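Plugging the article's own figures into a quick Python calculation shows how the pieces of the cost estimate fit together (the 20% collector share and the $80–$100 per ton range come from the paragraph above; taking the midpoint is my assumption):

```python
# Splitting the $80-$100/ton capture estimate using the 20% figure
# quoted for the collector hardware (midpoint of $90 assumed).
capture_cost_per_ton = 90
collector_share = 0.20

collector_cost = capture_cost_per_ton * collector_share
regeneration_and_storage = capture_cost_per_ton - collector_cost
print(collector_cost, regeneration_and_storage)   # 18.0 72.0

# The total sits well above the $25-$75/ton carbon price that tax or
# cap-and-trade proponents expect would stabilise emissions.
print(capture_cost_per_ton > 75)   # True
```

The arithmetic makes the economic problem concrete: roughly four-fifths of the cost is the energy-intensive regeneration and storage step, not the collector itself.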
Trapping carbon dioxide underground is a solution that offers an instant result, but the carbon is not recycled, and containment is finite and may have limits. An alternative solution may lie in increasing the amount of plant life to offset industrial levels of carbon dioxide production. Although natural resources like the Amazon and other rainforests are dwindling, recent claims of replicating photosynthesis may offer a natural solution. One of the biggest obstacles to artificial photosynthesis has been that scientists could only replicate it with a costly platinum catalyst.
Daniel Nocera at the Massachusetts Institute of Technology (MIT) says his team has found a way to replace it with a cheap nickel-molybdenum-zinc compound. This puts him one step closer to his goal of finding an inexpensive, portable source of renewable energy for developing countries.
Artificial leaves resemble a thin playing card, described by MIT as a "silicon solar cell with different catalytic materials bonded onto its two sides". Covered with water and placed in sunlight, the leaf splits water into hydrogen and oxygen, mimicking photosynthesis. "I've got to say that the Nocera system is very good; it's probably at the moment the best in the world, but there are other alternative approaches and many places are working on it," said Jim Barber, a biologist at Imperial College London.
Barber is part of another team researching artificial photosynthesis. His project uses iron oxide, or rust, as a cheap material to absorb light and serve as a semi-conductor. "The sun is the only energy source available to us of sufficient magnitude to satisfy our needs. That's why it's so important to continue to develop the research and development. The Nocera work is a giant leap forward towards this goal of capturing sunlight and storing it as a fuel," Barber explained. According to Barber, if artificial photosynthesis systems could use around 10% of the sunlight falling on them, they would only need to cover 0.16% of the Earth's surface to satisfy a global energy consumption rate of 20 terawatts, the amount it is predicted that the world will need in 2030.
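Barber's land-area figure can be sanity-checked with a back-of-the-envelope Python calculation. The mean insolation value below is an assumed round number, not from the article:

```python
# Sanity check on the 0.16% / 20 TW claim, assuming a round global
# average insolation of 250 W per square metre (my assumption).
EARTH_SURFACE_M2 = 5.1e14     # approximate total surface of the Earth
MEAN_INSOLATION = 250         # W/m^2, assumed long-term average
EFFICIENCY = 0.10             # the 10% conversion figure from the text

area = EARTH_SURFACE_M2 * 0.0016          # 0.16% of the surface
power_tw = area * MEAN_INSOLATION * EFFICIENCY / 1e12
print(round(power_tw, 1))     # about 20.4 TW, close to the 20 TW target
```

Under those assumptions the numbers hang together: 0.16% of the Earth's surface at 10% efficiency does deliver on the order of 20 terawatts.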
Alternatively, Panasonic has developed an artificial photosynthesis system which converts carbon dioxide (CO2) to organic materials by illuminating it with sunlight, at a world-leading efficiency of 0.2%. The efficiency is comparable with that of real plants used for biomass energy. The key to the system is the application of a nitride semiconductor, which makes the system simple and efficient. This development will be a foundation for the realization of a system for capturing and converting wasted carbon dioxide from incinerators, power plants or industrial activities.

Panasonic's artificial photosynthesis system has a simple structure with highly efficient CO2 conversion, and it can utilize direct sunlight or focused light. Nitride semiconductors have attracted attention for their potential applications in highly efficient optical and power devices for energy saving. However, their potential has been revealed to extend beyond solid-state devices; more specifically, they can be used as a photo-electrode for CO2 reduction. By fabricating a device structure using thin-film semiconductor processes, their performance as a photo-electrode has been greatly improved.
The CO2 reduction takes place on a metal catalyst on the opposite side of the nitride semiconductor photo-electrode. The system, with a nitride semiconductor and a metal catalyst, generates mainly formic acid from CO2 and water under light, at a world-leading efficiency of 0.2%. Formic acid is an important industrial chemical for dyes and fragrances. The reaction rate is directly proportional to the light power, thanks to the low energy loss of the simple structure; in other words, the system can respond to focused light. This will make it possible to realize a simple and compact system for capturing and converting wasted carbon dioxide from incinerators and electric generation plants.
While research continues, it may yet be possible to create a system that truly mimics photosynthesis; the closest attempts so far have not been able to turn these complex biological processes into a clean food source. The best solution for the time being is probably to contain rising carbon dioxide emissions. Although sweeping waste products underground seems like a solution, there may be hidden factors which, as with most waste products, might reappear. It is uncertain whether artificial trees will be the solution, but given that natural resources are dwindling in number and industry shows no sign of slowing down, it might be wise to put research into terraforming technology...

Wednesday 26 December 2012

Stem cells, a brief description


Stem cells are biological cells found in all multicellular organisms that can divide (through mitosis) and differentiate into diverse specialized cell types, and can self-renew to produce more stem cells. In mammals, there are two broad types of stem cells: embryonic stem cells, which are isolated from the inner cell mass of blastocysts, and adult stem cells, which are found in various tissues. In adult organisms, stem cells and progenitor cells act as a repair system for the body, replenishing adult tissues. In a developing embryo, stem cells can differentiate into all the specialized cells (these are called pluripotent cells), but also maintain the normal turnover of regenerative organs, such as blood, skin, or intestinal tissues. Adult stem cells can be obtained from:
  • Bone marrow, which requires extraction by harvesting, that is, drilling into bone (typically the femur or iliac crest),
  • Adipose tissue (lipid cells), which requires extraction by liposuction,
  • Blood, which requires extraction through pheresis, wherein blood is drawn from the donor (similar to a blood donation), passed through a machine that extracts the stem cells and returns other portions of the blood to the donor
Stem cells can also be taken from umbilical cord blood just after birth. Of all stem cell types, autologous harvesting involves the least risk. By definition, autologous cells are obtained from one's own body, just as one may bank his or her own blood for elective surgical procedures.
Highly plastic adult stem cells are routinely used in medical therapies, for example in bone marrow transplantation. Stem cells can now be artificially grown and transformed (differentiated) into specialized cell types with characteristics consistent with cells of various tissues such as muscles or nerves through cell culture. Embryonic cell lines and autologous embryonic stem cells generated through therapeutic cloning have also been proposed as promising candidates for future therapies. Research into stem cells grew out of findings by Ernest A. McCulloch and James E. Till at the University of Toronto in the 1960s.
Recently, medical advances have obtained stem cells from an unusual source which, compared with the discomfort of a biopsy, is far less painful. Yuanyuan Zhang and his colleagues at the Wake Forest University School of Medicine in Winston-Salem, North Carolina, have made urethra-like tissue by growing stem cells extracted from the urine of four healthy volunteers on scaffolds made from pig gut tissue.
To do this, the team first converted stem cells extracted from urine into urothelial cells and smooth muscle cells - vital cell lines for making ureters, which empty fluid from the kidneys into the bladder, and urethras, which conduct it from the bladder out of the body. Zhang then chemically stripped all pig cells from layers of pig gut tissue, leaving just the underlying inert collagen scaffold. He coated this scaffold with the two types of cell. Two weeks later, the deposited cells had formed layers on the scaffolds resembling urethras and ureters. In another experiment, the same structures developed after the seeded scaffolds had been implanted in mice lacking an immune system, proving that the cells can survive and grow in live animals.
Zhang plans further experiments in larger animals and eventually in humans. He and his colleagues hope to emulate the clinical success seen two years ago, when researchers replaced a woman's damaged windpipe by growing her stem cells on a section of donated windpipe that had been stripped of the donor's cells. There seem to be ample stem cells in urine to make these structures: a single colony of converted cells can coat a scaffold up to 10 cubic centimetres in volume, and just 200 millilitres of urine contains enough stem cells to form 15 colonies, say the team. And while harvesting cells from urine may sound unusual, a rather more macabre alternative may also be put into practice.
Dead bodies can provide organs for transplants; now they might become a source of stem cells too. Huge numbers of stem cells can still be mined from bone marrow five days after death, to be potentially used in a variety of life-saving treatments.
Human bone marrow contains mesenchymal stem cells, which can develop into bone, cartilage, fat and other cell types. MSCs can be transplanted, and the type of cell they form depends on where they are injected. Cells injected into the heart, for example, can form healthy new tissue, a useful therapy for people with chronic heart conditions. Unlike other tissue transplants, MSCs taken from one person tend not to be rejected by another's immune system. In fact, MSCs appear to pacify immune cells. It is this feature which has made MSC treatments invaluable for children with graft-versus-host disease, in which transplants aimed at treating diseases such as leukaemia attack the child instead. Stem cell therapies require huge numbers of cells, though, and it can be difficult to obtain a sufficient amount from a living donor.
Paolo Macchiarini, who researches regenerative medicine at the Karolinska Institute in Stockholm, Sweden, describes the work as an excellent advance but says that the cells may not be as healthy as they seem. Their DNA may be affected by the death of surrounding tissue and exposure to cold temperatures. "We need to make sure the cells are safe," he says.
Corneal stem cells taken from the eyes of fresh cadavers have already been used to treat blindness in people with eye conditions that result from injury and scarring, but Chris Mason at University College London sees a potential hurdle in using such MSCs in therapy. "The work is novel and intriguing... but it would be better to use a living donor," he says. That's partly because medical regulators oppose treating individuals with stem cells from more than one source. "You can always go back and get more stem cells from a living donor if you need them, but if you use a cadaver, you'll eventually run out."
Stem cells offer a multitude of possible cures: since each cell can be considered a building block of an organism, in theory the blocks can be programmed into specific cell types for replacement or repair.

In 2006, Shinya Yamanaka made a groundbreaking discovery that would win him the Nobel Prize in Physiology or Medicine just six years later: he found a new way to ‘reprogramme’ adult, specialized cells to turn them into stem cells. These laboratory-grown stem cells are pluripotent - they can make any type of cell in the body - and are called induced pluripotent stem cells, or iPS cells. Only embryonic stem cells are naturally pluripotent. Yamanaka’s discovery means that theoretically any dividing cell of the body can now be turned into a pluripotent stem cell. Stem cell research is still a relatively young science, and small breakthroughs in the news are helping to change people's minds on the ethical issues. With alternatives to embryonic stem cells making great advances, it remains to be seen how we might harvest cells for repair...


Tuesday 25 December 2012

Sous-vide cooking, boil in a bag technology


Sous-vide (French for "under vacuum") is a method of cooking food sealed in airtight plastic bags in a water bath for longer than normal cooking times—72 hours in some cases—at an accurately regulated temperature much lower than normally used for cooking, typically around 55 °C (131 °F) to 60 °C (140 °F) for meats and higher for vegetables. The intention is to cook the item evenly, and not to overcook the outside while still keeping the inside at the same "doneness", keeping the food juicier.
In most cases, cooking sous vide simply involves two steps: the sealing of foods in plastic bags and submerging the bags into hot water baths for a period of time to heat through. The water is typically regulated at the desired final temperature of the food or just above. The food is held in the water bath until it reaches the same temperature as the water (and then held at that temperature until service or a final cooking step takes place such as searing). In many ways, this is similar to simmering (such as in a poached fish recipe), except the sous vide water is usually at a lower temperature and food is kept from making direct contact with the water by a barrier (such as a plastic bag or eggshell in the case of sous vide eggs, thus minimizing flavor and nutrient loss of the ingredients to the cooking liquid).
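As a rough illustration of that "heat through" step, the food's core temperature approaches the bath temperature exponentially (Newton's law of heating). The numbers below, including the 25-minute time constant, are purely illustrative assumptions, not measured values for any real cut of food:

```python
import math

def core_temp(t_min, t0=5.0, t_bath=55.0, tau=25.0):
    """Core temperature (deg C) after t_min minutes in a 55 deg C bath.

    Assumes fridge-cold food (5 deg C) and a made-up time constant tau
    of 25 minutes; real values depend on the food's size and shape.
    """
    return t_bath + (t0 - t_bath) * math.exp(-t_min / tau)

# Minutes until the core is within 0.5 deg C of the bath:
minutes = 0.0
while 55.0 - core_temp(minutes) > 0.5:
    minutes += 1.0
print(f"Core within 0.5 °C of the bath after about {minutes:.0f} minutes")
```

In practice the time constant depends heavily on the food's thickness, which is why practical sous-vide timing tables are organised by the dimensions of the cut.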
The theory was first described by Sir Benjamin Thompson (Count Rumford) in 1799 (although he used air as the heat transfer medium). It was re-discovered by American and French engineers in the mid-1960s and developed into an industrial food preservation method. The method was adopted by Georges Pralus in 1974 for the Restaurant Troisgros (of Pierre and Michel Troisgros) in Roanne, France. He discovered that when foie gras was cooked in this manner it kept its original appearance, did not lose excess amounts of fat and had better texture. Another pioneer in sous-vide is Bruno Goussault, who further researched the effects of temperature on various foods and became well known for training top chefs in the method. As chief scientist of Alexandria, Virginia-based food manufacturer Cuisine Solutions, Goussault developed the parameters of cooking times and temperatures for various food.
Sous-vide has become a common feature on television cooking shows. It has also been used to quickly produce significant quantities of meals for hurricane evacuees. Non-professional cooks are also beginning to use sous-vide cooking.
Clostridium botulinum bacteria can grow in food in the absence of oxygen and produce the deadly botulinum toxin, so sous-vide cooking must be performed under carefully controlled conditions to avoid botulism poisoning. Generally speaking, food that is heated and served within four hours is considered safe, but meat that is cooked for longer to tenderize must reach a temperature of at least 55 °C (131 °F) within four hours and then be kept there, in order to pasteurize the meat.

Pasteurization kills the botulism bacteria, but the possibility of hardy botulism spores surviving and reactivating once cool remains a concern, as with many preserved foods, however processed. For that reason, Baldwin's treatise specifies precise chilling requirements for "cook-chill", so that the botulism spores do not have the opportunity to grow or propagate. Extra precautions need to be taken for food to be eaten by people with compromised immunity. Pregnant women eating food cooked sous vide may expose themselves and/or their fetus to risk, and thus may choose to be more careful than usual.
Sous-vide's failure to penetrate the home kitchen is partly a matter of expense. A typical water bath for the domestic cook costs around £499, and a good vacuum sealer costs at least £50 on top.
One limitation of sous-vide cooking is the fact that browning (Maillard reactions) happens at much higher temperatures (above the boiling point of water). The flavors and "crust" texture developed by browning are generally seen as very desirable in the cooking of certain types of meat, such as a steak. The flavors and texture produced by browning cannot be obtained with only the sous-vide technique. In many cases, meats and other foods cooked with the sous-vide technique will be browned either before or after being placed in the water bath, using techniques such as grilling or searing on an extremely hot pan.

The whole process of cooking a ready meal in a bag for long-term storage offers the convenience of a quick heat-up, eat-and-go ready meal. But while the equipment remains relatively expensive, it seems this type of cooking may be left to large restaurants and emergency food packs for hurricane evacuees. Compared with the effort of a typical weekly shop, buying, preparing and freezing enough food for the week seems too much of a chore for sous vide. The novelty of this type of cooking would also probably wear off within a few months, after which it would just feel like boil-in-the-bag cooking: a simple task synonymous with boiling rice, or with cheap ready meals for students on a budget...



Monday 24 December 2012

The science of Santa, or the time dilation of giving


Santa Claus, also known as Saint Nicholas, Father Christmas and simply "Santa", is a figure with legendary, mythical, historical and folkloric origins.
If we assume that Santa has to travel 510,000,000 km on Christmas Eve, and that he has 32 hours to do it (see here for the reasoning behind those numbers), then Santa will be travelling at 10,703,437.5 km/hr, or about 1,800 miles per second, all night (assuming he never stops: some sort of sleigh-mounted present-launcher will be required to shoot gifts down chimneys while moving).
Also assuming that each of these 91.8 million stops is evenly distributed around the earth (which, of course, we know to be false, but for the purposes of our calculations we will accept), we are now talking about 0.78 miles per household, a total trip of 75½ million miles, not counting stops to do what most of us must do at least once every 31 hours, plus feeding the reindeer, and so on.
This means that Santa’s sleigh is moving at 650 miles per second, 3,000 times the speed of sound. For purposes of comparison, the fastest manmade vehicle on earth, the Ulysses space probe, moves at a poky 27.4 miles per second; a conventional reindeer can run, tops, 15 miles per hour.
The payload on the sleigh adds another interesting element. Assuming that each child gets nothing more than a medium-sized Lego set (2 pounds), the sleigh is carrying 321,300 tons, not counting Santa, who is invariably described as overweight. On land, conventional reindeer can pull no more than 300 pounds. Even granting that "flying reindeer" could pull TEN times their normal amount, we cannot do the job with eight, or even nine. We need 214,200 reindeer. This increases the payload, not even counting the weight of the sleigh, to 353,430 tons.
353,000 tons traveling at 650 miles per second creates enormous air resistance, which will heat the reindeer up in the same fashion as a spacecraft re-entering the earth’s atmosphere. The lead pair of reindeer will absorb 14.3 QUINTILLION joules of energy. Per second. Each. In short, they will burst into flame almost instantaneously, exposing the reindeer behind them, and create deafening sonic booms in their wake. The entire reindeer team will be vaporized within 4.26 thousandths of a second. Santa, meanwhile, will be subjected to centrifugal forces 17,500.06 times greater than gravity. A 250-pound Santa (which seems ludicrously slim) would be pinned to the back of his sleigh by 4,375,015 pounds of force.
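Taking the essay's own assumptions at face value (75½ million miles in 31 hours, 2 lb per gift, a 300 lb pull per reindeer multiplied tenfold for flight, reindeer weighing about 300 lb each), the figures can be checked in a few lines:

```python
# Back-of-the-envelope check of the essay's Santa arithmetic.
# Every input below is one of the essay's assumptions, not measured data.

MILES = 75.5e6                      # total route length
HOURS = 31                          # delivery window
speed = MILES / (HOURS * 3600)      # miles per second
print(f"Sleigh speed: {speed:.0f} mi/s")          # ≈ 677, near the quoted 650

gift_lb = 321_300 * 2000            # payload of gifts, in pounds
pull_lb = 300 * 10                  # one "flying" reindeer's pull
reindeer = gift_lb / pull_lb
print(f"Reindeer needed: {reindeer:,.0f}")        # 214,200

total_tons = 321_300 + reindeer * 300 / 2000      # add reindeer at ~300 lb each
print(f"Total payload: {total_tons:,.0f} tons")   # 353,430

# Note: 250 lb x 17,500.06 g comes to 4,375,015 lb, slightly off
# the figure often quoted in circulated copies of the essay.
force_lb = 250 * 17_500.06
print(f"Force on Santa: {force_lb:,.0f} lb")      # 4,375,015
```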
While conventional physics explains the impossibility, unconventional science fiction might hold some clues. Given the speeds required to finish the job in one night, one would assume each gift-giving stop must be drastically shortened by the time constraints. Instead, it is likely that the task is performed in a way that looks magical to us but could, in theory, rely on time dilation.

In the theory of relativity, time dilation is an actual difference of elapsed time between two events as measured by observers either moving relative to each other or situated differently with respect to gravitational masses. While time moves forward conventionally on earth, Santa may experience a time dilation effect that allows him to travel and become invisible to most people. Nor can we exclude the possibility of an improbability drive, which could theoretically allow the traveller to appear anywhere in the universe, or anywhere locally, depending on the drive's characteristics.
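As a back-of-the-envelope check, the standard Lorentz factor γ = 1/√(1 − (v/c)²) shows how little conventional time dilation the quoted 650 miles per second would actually buy:

```python
import math

# Lorentz factor and clock lag for a sleigh at 650 mi/s over a 31-hour night.
c = 299_792_458.0                  # speed of light, m/s
v = 650 * 1609.344                 # 650 mi/s converted to m/s
gamma = 1 / math.sqrt(1 - (v / c) ** 2)

trip_s = 31 * 3600                 # the 31-hour delivery window, in seconds
lag = trip_s * (1 - 1 / gamma)     # seconds Santa's clock falls behind ours
print(f"gamma = {gamma:.9f}, clock lag ≈ {lag:.2f} s")  # lag well under a second
```

Less than a second gained over the whole night, so whatever Santa uses, it must be considerably more exotic than special relativity at sleigh speeds.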
While we are limited in our knowledge of dimensional space, the toys in Santa's sack are hard to account for: it seems impossible to explain the 850,000 toys that need to be distributed over the world. While 3D printers are a reality, it is more likely that the toys are manufactured in transit. The other explanation is that dimensional space can allow a pocket universe to exist in a relatively small area, with only a doorway between the two universes. The Tardis theory would allow the comfort of travel and infinite space for gift storage. Transdimensional engineering may explain the possibility of storage, and it could also be the key to entering someone's home: portals can be expanded to the point at which a human can enter, or reduced to pass through chimneys or small holes in the wall. As we grow into adulthood, we are likely to explain away the unrealities of magic as mere fiction. Science may have only a small grasp of what is and is not possible, but the truth is that most people like the unknown and will use faith to keep the fiction alive. Whether you believe in Santa Claus or not, just make sure you have a good time this Christmas..... Merry Christmas everyone from MYDOGISDEAD...



Sunday 23 December 2012

The History of Toys, fun through the ages


The origin of toys is prehistoric; dolls representing infants, animals, and soldiers, as well as representations of tools used by adults are readily found at archaeological sites. The origin of the word "toy" is unknown, but it is believed that it was first used in the 14th century.
Toys excavated from the Indus valley civilization (3000-1500 BCE) include small carts, whistles shaped like birds, and toy monkeys which could slide down a string.
The Indus civilization seems to have been known as "Meluhha" to the early Sumerians in Mesopotamia (dates given are rarely precisely known and should be regarded as rough approximations). Among the ruins of Harappa a model of an ox cart was found. Such objects are not rare in the Indus civilization, and they are usually assumed to have been toys. But we know so little about the Indus civilization that they might just as well have been ritual objects or offerings to the gods. Whatever the purpose of these "toys", they do show us (in the absence of any excavated full-scale carts) the sophisticated technology of this very early civilization. Goods were transported on such and similar carts.
Another example is a chalk figurine that was probably a favourite possession of a three-year-old, placed next to the child when he or she died in the late Bronze Age or early Iron Age, around 3,000 years ago. Archaeologists who discovered the grave, where the child was lying on his or her side, believe the toy - perhaps placed there by a doting father - is the earliest known depiction of a hedgehog in British history. The diggers were working to the west of Stonehenge, in what is known as the Palisade Ditch, when they made the remarkable discovery last month in the top of the pit in which the child was buried. Archaeologist Dennis Price said: 'It is not difficult to envisage the raw emotion and harrowing grief that would have accompanied the death of this child. Amid the aura of gloom that surrounds Stonehenge, it comes as a beam of light to find a child's toy lovingly placed with the tiny corpse to keep him or her company through eternity.'
Egyptian children also played with dolls, although these were much more sophisticated than chalk figurines. Dolls of the time had wigs and movable limbs, and were made from stone, pottery, and wood. In Ancient Greece and Ancient Rome, children played with dolls made of wax or terracotta, sticks, bows and arrows, and yo-yos. When Greek children, especially girls, came of age it was customary for them to sacrifice the toys of their childhood to the gods. On the eve of their wedding, young girls around fourteen would offer their dolls in a temple as a rite of passage into adulthood.
As technology changed and civilization progressed, toys also changed. Whereas ancient toys were made from materials found in nature, like stone, wood, and grass, modern toys are often made from plastic, cloth, and synthetic materials, and are often powered by batteries. Ancient toys were often made by the parents and family of the children who used them, or by the children themselves. Modern toys, in contrast, are often mass-produced and sold in stores.
By the early 20th century there were dolls that could say "mama". Today there are computerized dolls that can recognize and identify objects, the voice of their owner, and choose among hundreds of pre-programmed phrases with which to respond. The materials that toys are made from have changed, what toys can do has changed, but the fact that children play with toys has not changed.
Toys serve multiple purposes in both humans and animals. They provide entertainment while fulfilling an educational role. Toys enhance cognitive development and stimulate creativity. They aid in the development of physical and mental skills which are necessary in later life. One of the simplest toys, a set of wooden blocks, is also one of the best toys for developing minds. Andrew Witkin, director of marketing for Mega Brands, told Investor's Business Daily that "They help develop hand-eye coordination, math and science skills and also let kids be creative." Other toys, like marbles, jackstones, and balls, serve similar functions in child development.
While most toys serve some educational purpose in developing the mind, the history of toys indicates that imagination plays a large part too. With Christmas around the corner, it is fair to suggest that toys are on the minds of most people; hopefully everyone gets what they wished for...