The human brain may have the capacity to hold as much information in its memory as is contained on the entire Internet, new research suggests.

Researchers found that, unlike a classical computer that encodes information as 0s and 1s, a brain cell uses 26 different ways to encode its "bits." They calculated that the brain could store 1 petabyte (a quadrillion bytes) of information.

"This is a real bombshell in the field of neuroscience," Terry Sejnowski, a biologist at the Salk Institute in La Jolla, California, said in a statement. "Our new measurements of the brain's memory capacity increase conservative estimates by a factor of 10."

Awesome computer

What's more, the human brain can store this mind-boggling amount of information while sipping just enough power to run a dim light bulb.

By contrast, a computer with similar memory and processing power would need 1 gigawatt of power, or "basically a whole nuclear power station to run one computer that does what our 'computer' does with 20 watts," said study co-author Tom Bartol, a neuroscientist at the Salk Institute.

In particular, the team wanted to take a closer look at the hippocampus, a brain region that plays a key role in learning and short-term memory.

To untangle its mysteries, the research team took a tiny slice of a rat's hippocampus, placed it in embalming fluid, then sliced it thinly with an extremely sharp diamond knife, a process akin to "slicing an orange," Bartol said. (Though a rat's brain is not identical to a human brain, the basic anatomical features and function of synapses are very similar across all mammals.) The team then embedded the thin tissue in plastic, looked at it under a microscope and created digital images.

Next, researchers spent one year tracing, with pencil and paper, every type of cell they saw. After all that effort, the team had traced all of the cells in the sample, a staggeringly small volume of tissue.
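Bartol's power comparison can be checked directly from the figures he gives: 1 gigawatt for a comparable computer versus roughly 20 watts for the brain is a ratio of 50 million. A quick arithmetic sketch:

```python
# Figures from the article: a comparable computer would need ~1 gigawatt,
# while the brain runs on about 20 watts.
computer_watts = 1e9
brain_watts = 20

ratio = computer_watts / brain_watts
print(f"The computer would use {ratio:,.0f}x more power")  # 50,000,000x
```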
Size distribution

Then, the team counted up all of the complete neurons, or brain cells, in the tissue, which totaled 450. Of that number, 287 had the complete structures the researchers were interested in.

Neurons look a bit like swollen, misshapen balloons, with long tendrils called axons and dendrites snaking out from the cell body. Axons act as the brain cell's output wire, sending out a flurry of molecules called neurotransmitters, while tiny spines on dendrites receive the chemical messages sent by the axon across a narrow gap, known as the synapse. (The specific spot on the dendrite where these chemical messages are received from the other side of the synapse is called the dendritic spine.) The receiving brain cell can then fire off its own cache of neurotransmitters to relay that message to other neurons, though most often, it does nothing in response.

Earlier work had shown that the biggest synapses dwarf the smallest ones by a factor of 60. That size difference reflects the strength of the underlying connection: while the typical neuron relays incoming signals about 20 percent of the time, that percentage can rise over time. The more a brain circuit gets a workout (that is, the more one network of neurons is activated), the higher the odds that one neuron in that circuit will fire when another sends it a signal.
The process of strengthening these neural networks seems to enlarge the physical point of contact at the synapses, increasing the amount of neurotransmitter they can release, Bartol said.

If neurons are essentially chattering to one another across a synapse, then a brain cell communicating across a bigger synapse has a louder voice than one communicating across a smaller synapse, Bartol said.

But scientists hadn't understood much about how many sizes synapses come in and how they change in response to signals. Then Bartol, Sejnowski and their colleagues noticed something funny in their hippocampal slice. About 10 percent of the time, a single axon snaked out and connected to the same dendrite at two different dendritic spines. These oddball axons were sending exactly the same input to each of the spots on the dendrite, yet the sizes of the synapses, where axons "talk" to dendrites, varied by an average of 8 percent. That meant that the natural variability in how much a given message changes the size of the underlying synapse is about 8 percent.

So the team then asked: If synapses can differ in size by a factor of 60, and the size of a synapse varies by about 8 percent due to pure chance, how many different synaptic sizes could fit within that size range and still be detected as distinct by the brain?

By combining that data with signal-detection theory, which dictates how different two signals must be before the brain can tell them apart, the researchers found that synapses could come in 26 distinct size ranges. That, in essence, revealed how many different "volumes" of voice neurons use to chatter with one another. Previously, researchers had thought these synapses came in just a few sizes.

From there, they could calculate exactly how much information could be transmitted between any two neurons. Computers store information as bits, which can take two possible values, 0 or 1.
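To build intuition for where a number like 26 can come from, here is a back-of-the-envelope sketch, not the paper's actual signal-detection analysis: if distinguishable synaptic sizes form a geometric ladder spanning the factor-of-60 range, with each rung a fixed percentage step above the last, the number of rungs is about log(60)/log(1 + step) + 1. The 17.8 percent step used below is a hypothetical value chosen for illustration, roughly twice the 8 percent trial-to-trial noise.

```python
import math

size_range = 60   # largest synapse / smallest synapse, from the study
step = 0.178      # hypothetical fractional gap between adjacent distinguishable
                  # sizes (~2x the 8% noise level; illustrative only)

# Distinguishable sizes form a geometric ladder: s, s*(1+step), s*(1+step)**2, ...
# Count how many rungs span the full size range:
n_sizes = math.log(size_range) / math.log(1 + step) + 1
print(round(n_sizes))  # 26
```

The actual analysis in the study used signal-detection theory to decide how far apart two sizes must be to be discriminated reliably; this ladder merely shows that a ~60x range with noise on the order of 10 percent naturally yields a few dozen distinguishable levels.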
But that binary message from a neuron (to fire or not) can be received by synapses of 26 different sizes. So the team used basic information theory to calculate how many bits of data each synapse could hold.

"To convert the number 26 into units of bits we simply say 2 raised to the n power equals 26 and solve for n. In this case n equals 4.7 bits," Bartol said.

That storage capacity translates to about 10 times what was previously believed, the researchers reported online in the journal eLife.

Very efficient

The new findings also shed light on how the brain stores so much information while remaining so efficient. The fact that most neurons don't fire in response to incoming signals, yet the body is exceptionally precise in translating those signals into physical structures, explains in part why the brain is more efficient than a computer: most of its heavy lifters are idle most of the time.

However, even if the typical brain cell is inactive 80 percent of the time, that still doesn't explain why a computer needs 50 million times more energy to do the same tasks as a human brain.

"The other part of the story might have to do with how biochemistry works compared to how electrons work in a computer. Computers are using electrons to do the computations, and electrons flowing in a wire make a lot of heat, and that heat is wasted energy," Bartol said. Biochemical pathways may simply be much more efficient, he added.
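Bartol's conversion can be reproduced in a couple of lines, and the article's 1-petabyte figure can be inverted to see how many synapses it implies (the synapse count below is derived from the article's own numbers, not stated in it):

```python
import math

# Solve 2**n = 26 for n: the information capacity of one synapse
bits_per_synapse = math.log2(26)
print(round(bits_per_synapse, 1))  # 4.7

# Invert the 1-petabyte estimate: how many synapses at 4.7 bits each
# would it take to hold a quadrillion bytes?
petabyte_bits = 1e15 * 8
implied_synapses = petabyte_bits / bits_per_synapse
print(f"{implied_synapses:.1e}")   # 1.7e+15 synapses
```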