
Monday, 19 November 2012

Cognitive Computing, a brief description


The brain of a vertebrate is the most complex organ of its body. In a typical human the cerebral cortex (the largest part) is estimated to contain 15 to 33 billion neurons, each connected by synapses to several thousand other neurons. These neurons communicate with one another by means of long protoplasmic fibers called axons, which carry trains of signal pulses called action potentials to distant parts of the brain or body, targeting specific recipient cells. Much of what we know about the brain is based on observational data from MRI scans or post-mortem examination.

In many philosophies, the conscious mind is considered to be a separate entity, existing in a parallel realm not described by physical law. Some people claim that this idea gains support from the description of the physical world provided by quantum mechanics. A recent theory of cognitive computation tries to merge silicon-based computer models with human-like cognitive thinking, under a paradigm called the memory-prediction framework.
The memory-prediction framework is a theory of brain function that was created by Jeff Hawkins and described in his 2004 book On Intelligence. This theory concerns the role of the mammalian neocortex and its associations with the hippocampus and the thalamus in matching sensory inputs to stored memory patterns and how this process leads to predictions of what will happen in the future.
The memory-prediction framework provides a unified basis for thinking about the adaptive control of complex behavior. Although certain brain structures are identified as participants in the core 'algorithm' of prediction-from-memory, these details are less important than the set of principles that are proposed as basis for all high-level cognitive processing.
The central concept of the memory-prediction framework is that bottom-up inputs are matched in a hierarchy of recognition, and evoke a series of top-down expectations encoded as potentiations. These expectations interact with the bottom-up signals to both analyse those inputs and generate predictions of subsequent expected inputs. Each hierarchy level remembers frequently observed temporal sequences of input patterns and generates labels or 'names' for these sequences.
 When an input sequence matches a memorized sequence at a given layer of the hierarchy, a label or 'name' is propagated up the hierarchy - thus eliminating details at higher levels and enabling them to learn higher-order sequences. This process produces increased invariance at higher levels. Higher levels predict future input by matching partial sequences and projecting their expectations to the lower levels. However, when a mismatch between input and memorized/predicted sequences occurs, a more complete representation propagates upwards. This causes alternative 'interpretations' to be activated at higher levels, which in turn generates other predictions at lower levels.
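As a very rough illustration of this matching-and-prediction loop (a toy sketch in Python, not Hawkins' actual algorithm, with every name and sequence below invented for the example), each level can store known sequences, pass a compact 'name' upward when a sequence is recognised, pass the raw details upward on a mismatch, and complete partial sequences as a top-down prediction:

    # Toy sketch of the memory-prediction idea (hypothetical; not Hawkins' HTM code).
    # A level stores known sequences; on a match it emits a 'name' upward,
    # on a mismatch it passes the full detail upward instead.
    class Level:
        def __init__(self, known_sequences):
            # map tuple-of-inputs -> name for that sequence
            self.known = {tuple(seq): name for name, seq in known_sequences.items()}

        def process(self, sequence):
            seq = tuple(sequence)
            if seq in self.known:
                return ("name", self.known[seq])   # recognised: send compact label up
            return ("detail", seq)                 # surprise: send full detail up

        def predict(self, partial):
            # top-down expectation: complete any known sequence starting with 'partial'
            for seq in self.known:
                if seq[:len(partial)] == tuple(partial):
                    return list(seq[len(partial):])
            return []

    # Level 1 recognises letter patterns; level 2 recognises sequences of level-1 names.
    level1 = Level({"AB": ["a", "b"], "CD": ["c", "d"]})
    level2 = Level({"WORD": ["AB", "CD"]})

    print(level1.process(["a", "b"]))    # ('name', 'AB'): details vanish at higher levels
    print(level1.predict(["c"]))         # ['d']: prediction projected back down
    print(level2.process(["AB", "CD"]))  # ('name', 'WORD'): higher-order sequence
    print(level1.process(["a", "x"]))    # ('detail', ('a', 'x')): mismatch propagates up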
In other words, a brain recognizes a pattern from its sensors and remembers it for future events, until a new set of patterns creates extra memories, allowing the brain to act appropriately to whatever patterns it sees. But unlike this new model of thinking, the human brain has ethics and personal tastes, and the ability to back out of something based on hunches and personal comfort.


Traditionally a computer does its work sequentially for the most part and is driven by a clock. The clock, like a conductor in a band, drives every instruction and piece of data to its next location. As clock rates increase to drive data faster, power consumption goes up dramatically, and even at rest these machines need a lot of electricity. More importantly, computers have to be programmed. They are hard-wired and fault-prone. They are good at executing defined algorithms and performing analytics. With $41 million in funding from the Defense Advanced Research Projects Agency (DARPA), the scientists at IBM's Almaden lab set out to make a brain in a project called Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE).
IBM's "neurosynaptic computing chip" features a silicon core capable of digitally replicating the brain's neurons, synapses and axons. To achieve this, researchers took a dramatic departure from the conventional von Neumann computer architecture, which links internal memory and a processor with a single data channel. This structure allows for data to be transmitted at high, but limited rates, and isn't especially power efficient especially for more sophisticated, scaled-up systems. Instead, IBM integrated memory directly within its processors, wedding hardware with software in a design that more closely resembles the brain's cognitive structure. This severely limits data transfer speeds, but allows the system to execute multiple processes in parallel (much like humans do), while minimizing power usage. IBM's two prototypes have already demonstrated the ability to navigate, recognize patterns and classify objects, though the long-term goal is to create a smaller, low-power chip that can analyze more complex data to learn.

The IBM project is led by Dharmendra S. Modha, with researchers from five universities and the Lawrence Berkeley National Laboratory. Dubbed "Blue Matter," its software platform for neuroscience modeling pulls together archived magnetic resonance imaging (MRI) scan data and assembles it on a Blue Gene/P supercomputer. By May 2009, IBM had essentially simulated a brain with 1 billion neurons and 10 trillion synapses, one they claim is about the equivalent of a cat's cortex, or 4.5% of a human brain.
Only recently this year, IBM says it has now accomplished this milestone with its new "TrueNorth" system running on one of the world's fastest operating supercomputers, the Lawrence Livermore National Laboratory (LLNL) Blue Gene/Q Sequoia, using 96 racks (1,572,864 processor cores, 1.5 PB of memory, 98,304 MPI processes, and 6,291,456 threads). IBM and LLNL achieved an unprecedented scale of 2.084 billion neurosynaptic cores containing 53×10^10 (530 billion) neurons and 1.37×10^14 (137 trillion) synapses, running only 1,542 times slower than real time.
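A quick back-of-envelope check of those published totals (the per-core figures below are derived here, not quoted from IBM) works out to roughly 254 neurons and about 66,000 synapses per neurosynaptic core:

    # Back-of-envelope check of the reported TrueNorth simulation scale
    # (derived ratios, not figures quoted from IBM).
    cores = 2.084e9        # neurosynaptic cores
    neurons = 53e10        # 530 billion neurons
    synapses = 1.37e14     # 137 trillion synapses

    print(f"neurons per core:    {neurons / cores:,.0f}")     # ~254
    print(f"synapses per core:   {synapses / cores:,.0f}")    # ~65,700
    print(f"synapses per neuron: {synapses / neurons:,.0f}")  # ~258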

The network of over 2 billion of these neurosynaptic cores is divided into 77 brain-inspired regions with probabilistic intra-region and inter-region connectivity, which doubles the brain power of "Blue Matter". This new sort of parallel processing could give rise to intelligent security systems, or cars that assist the driver with navigation and one day help drive. Cognitive computing is still in its infancy, but considering that an average human brain has around 86 billion neurons, the artificial brain would need about 86 large rooms filled with "Blue Matter" supercomputers to be a human equivalent. Perhaps this new way of cognitive computing will speed up the singularity, the merging of man and machine, by reverse engineering the thought processes of a brain. The task now is to shrink each billion silicon neuron connections to a size no bigger than a small melon, with the energy consumption of a light bulb. With this in mind, intelligent robots in the future are a real possibility. There will also need to be an Asimov-style no-kill rule, and probably a rethink of what a soul is...


Monday, 10 September 2012

The Cyborgs are coming or are they here?

The stricter definition of a cyborg almost always involves increasing or enhancing normal capabilities. While cyborgs are commonly thought of as mammals, they might also conceivably be any kind of organism, and the term "cybernetic organism" has been applied to networks such as road systems, corporations and governments.
The term can also apply to micro-organisms which are modified to perform at higher levels than their unmodified counterparts. In medicine, there are two important and different types of cyborgs: the restorative and the enhanced. Restorative technologies "restore lost function, organs, and limbs". The key aspect of restorative cyborgization is the repair of broken or missing processes to revert to a healthy or average level of function. There is no enhancement to the original faculties and processes that were lost. On the contrary, the enhanced cyborg "follows a principle, and it is the principle of optimal performance: maximising output (the information or modifications obtained) and minimising input (the energy expended in the process)". Thus, the enhanced cyborg intends to exceed normal processes or even gain new functions that were not originally present.
Military organizations' research has recently focused on the use of cyborg animals for a supposed tactical advantage. DARPA has announced its interest in developing "cyborg insects" that transmit data from sensors implanted into the insect during the pupal stage. The insect's motion would be controlled by a Micro-Electro-Mechanical System (MEMS), and it could conceivably survey an environment or detect explosives and gas.

Cornell University researchers have succeeded in implanting electronic circuit probes into tobacco hornworms as early pupae. The hornworms pass through the chrysalis stage to mature into moths whose muscles can be controlled with the implanted electronics. The pupal insertion stage is shown in inset "i" of the picture above. The successful emergence of a microsystem-controlled insect is shown in inset "ii"; the microsystem platform is shown held with tweezers. The X-ray image (A) shows the probes inserted into the dorsoventral and dorsolongitudinal flight muscles. CT images (B) show components of high absorbance, indicating tissue growth around the probe.

The research also indicated the most favorable and least favorable times for insertion of control devices. The overall size of the circuit board is 8 x 7 mm, with a total weight of about 500 mg. The battery has a capacity of 16 mAh and weighs 240 mg. A driving voltage of 5 volts causes the tobacco hornworm's blade muscles (two pairs) to move for flight and maneuvering. DARPA HI-MEMS program director Amit Lal credits science fiction writer Thomas Easton with the idea. Lal read Easton's 1990 novel Sparrowhawk, in which animals enlarged by genetic engineering were outfitted with implanted control systems. Dr. Easton, a professor of science at Thomas College, sees a number of applications for HI-MEMS insects. "Moths are extraordinarily sensitive to sex attractants, so instead of giving bank robbers money treated with dye, they could use sex attractants instead. Then, a moth-based HI-MEMS could find the robber by following the scent. Also, with genetic engineering DARPA could replace the sex attractant receptor on the moth antennae with receptors for other things, like explosives, drugs or toxins," said Easton.
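Those numbers also allow a very rough estimate of how long the implant could run on its battery. The 16 mAh capacity is given above, but the average current drawn by the stimulator is not, so the 10 mA figure below is purely a hypothetical assumption for illustration:

    # Rough battery-life estimate for the implanted stimulator.
    # The 16 mAh capacity is from the text above; the 10 mA average current draw
    # is NOT given there and is a hypothetical assumption.
    capacity_mAh = 16.0
    assumed_current_mA = 10.0

    hours = capacity_mAh / assumed_current_mA
    print(f"~{hours:.1f} h of operation ({hours * 60:.0f} minutes)")   # ~1.6 h (96 minutes)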
Artificial tissue can already be grown on three-dimensional scaffolds made of biological materials that are not electrically active. Electrical components have been added to cultured tissue before, but not integrated into its structure, so they were only able to glean information from the surface. A research team combined these strands of work to create electrically active scaffolds. They created 3D networks of conductive nanowires studded with silicon sensors. Crucially, the wires had to be flexible and extremely small to avoid impeding the growth of tissue. The scaffold also contained traditional biological materials such as collagen.

The researchers were able to grow rat neurons, heart cells and muscle in these hybrid meshes. In the case of the heart cells, they started to contract just like normal cells, and the researchers used the network to read out the rate of the beats. When they added a drug that stimulates heart cell contraction, they detected an increase in the rate, indicating that the tissue was behaving normally and that the network could sense such changes.
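Reading a beat rate out of such an embedded sensor is, at its simplest, a peak-counting problem. A minimal sketch is given below, with a synthetic signal, an arbitrary threshold and an assumed sample rate, since the actual readout electronics are not described in the post:

    # Sketch of estimating a beat rate from a sampled voltage trace by counting
    # upward threshold crossings. The signal, threshold and sample rate are all
    # invented for illustration; the real readout hardware is not described above.
    import math

    sample_rate_hz = 100
    duration_s = 10
    beat_hz = 1.5   # synthetic "heart tissue" beating at 90 bpm

    signal = [math.sin(2 * math.pi * beat_hz * t / sample_rate_hz)
              for t in range(sample_rate_hz * duration_s)]

    def beats_per_minute(samples, threshold=0.8, rate_hz=sample_rate_hz):
        # count upward crossings of the threshold
        crossings = sum(1 for prev, cur in zip(samples, samples[1:])
                        if prev < threshold <= cur)
        return crossings * 60.0 / (len(samples) / rate_hz)

    print(f"estimated rate: {beats_per_minute(signal):.0f} bpm")   # ~90 bpm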
The team also managed to grow an entire blood vessel about 1.5 centimetres long from human cells, with wires snaking through it. By recording electrical signals from inside and outside the vessel - something that was never possible before - the team was able to detect electrical patterns that they say could give clues to inflammation, to whether tissue has undergone changes that make it prone to tumour formation, or to impending heart disease.

One possible use for this type of bioelectrical engineering is to directly measure the effects of drugs on synthetically grown human tissue without ever having to test them in an actual human being. So far, though, the researchers have only used the electrical scaffolds to record signals - they have yet to feed commands to cells. The next step could be to add components to the nanoscaffold that could "talk" to neurons.
Researchers in Chicago have gone even further. The Neural Engineering Center for Artificial Limbs has developed techniques that combine myo-electric limbs with nerve transplants to deliver even finer motor control, with patients even being able to feel the objects they grip or touch.

Our merger with machines is already happening. We replace hips and other parts of our bodies with titanium and steel parts. More than 50,000 people have tiny computers surgically implanted in their heads with direct neural connections to their cochleas to enable them to hear. In the testing stage, there are retina microchips to restore vision and motor implants to give quadriplegics the ability to control computers with thought. Robotic prosthetic legs, arms, and hands are becoming more sophisticated. I don't think I'll live long enough to get a wireless Internet brain implant, but my kids or their kids might.

And then there are other things still further out, such as drugs and genetic and neural therapies to enhance our senses and strength. While we become more robotic, our robots will become more biological, with parts made of artificial and yet organic materials. In the future, we might share some parts with our robots, which could help prolong life. From the simple idea of replacing limbs to complete body repair, technology has a realistic claim to making a human cyborg. Granted, bioelectrical engineering is still in its early stages, but with every technological advance the appeal of artificial improvement grows. Could this appeal grow over the next few decades and reach a consumer market? The cost of an artificial eye, the Argus II implant, is around $115,000, which is a high price for a 60-pixel screen on your eyeball. But maybe that price will drop, and with better interconnectivity or augmented reality software, these cyborgs will most likely have an unfair advantage over the rest of us humans.