Saturday, May 05, 2007
Half mouse brain simulated on supercomputer
Researchers from the IBM Almaden Research Lab and the University of Nevada have used the Blue Gene/L supercomputer to model half a virtual mouse brain.
Eight million neurons, each with up to 6,300 synapses (connections with other nerve cells): a challenge for simulation.
Researchers explain that such a modeling initiative puts tremendous constraints on the computation, communication and memory capacity of any computing platform. The Blue Gene/L supercomputer ran the complex simulation for 10 seconds, at a speed 10 times slower than real time. Useless but impressive...
Source: BBC News, last week.
For those who do not know:
- A supercomputer is a computer that leads the world in terms of processing capacity, particularly speed of calculation, at the time of its introduction. The term "Super Computing" was first used by the New York World newspaper in 1920 to refer to large custom-built tabulators IBM made for Columbia University. According to this website, Blue Gene/L is still number 1, but Baker is on its way...
- Chemical synapses are specialized junctions through which the cells of the nervous system signal to each other and to non-neuronal cells such as those in muscles or glands. Chemical synapses allow the neurons of the central nervous system to form interconnected neural circuits. They are thus crucial to the biological computations that underlie perception and thought. They provide the means through which the nervous system connects to and controls the other systems of the body. A chemical synapse between a motor neuron and a muscle cell is called a neuromuscular junction; this type of synapse is well-understood.
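To get a feel for why this is computationally so heavy, here is a toy sketch of what simulating spiking neurons involves. This uses the leaky integrate-and-fire model, a standard simplified neuron model, not necessarily the one IBM used, and all the network sizes and parameters below are made up for illustration: every neuron needs an update on every time step, plus extra work per synapse whenever a spike fires, so cost scales with neurons x steps x connectivity.

```python
# Toy network of leaky integrate-and-fire neurons (illustrative only;
# parameters and connectivity are invented, not taken from the IBM work).
import random

def simulate(n_neurons=100, n_steps=1000, dt=1.0, tau=20.0,
             v_thresh=1.0, v_reset=0.0, p_connect=0.1, w=0.15,
             drive=0.06, seed=0):
    """Simulate a random network of leaky integrate-and-fire neurons.

    Returns the total number of spikes fired."""
    rng = random.Random(seed)
    # Random synapses: each neuron connects to ~p_connect of the others.
    targets = [[j for j in range(n_neurons)
                if j != i and rng.random() < p_connect]
               for i in range(n_neurons)]
    v = [0.0] * n_neurons          # membrane potentials
    incoming = [0.0] * n_neurons   # synaptic input arriving this step
    total_spikes = 0
    for _ in range(n_steps):
        nxt = [0.0] * n_neurons
        for i in range(n_neurons):
            # Leaky integration: decay toward rest, add drive + synapses.
            v[i] += dt * (-v[i] / tau) + drive + incoming[i]
            if v[i] >= v_thresh:   # spike: reset and notify targets
                v[i] = v_reset
                total_spikes += 1
                for j in targets[i]:
                    nxt[j] += w
        incoming = nxt
    return total_spikes
```

Scale this loop up to 8 million neurons with thousands of synapses each, and the appeal of a machine like Blue Gene/L becomes obvious.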
Labels: exploits
Respectful Camera
Computer scientists from UCB (University of California, Berkeley) have developed a "respectful camera," a new type of video surveillance system that covers a person's face with an oval for privacy but can remove the oval for an investigation!
Well, the technology is still in the research phase, as it's only possible to cover someone's face if that person is wearing a marker such as a green vest or a yellow hat (!!). Anyway, this idea seems to be a good compromise between privacy advocates and those concerned about security.
The researchers use a statistical classification approach called adaptive boosting (or AdaBoost) to teach the system to identify the marker in a visually complicated environment, and added a tracker to compensate for the subject's velocity and other interframe information. During the tests, the marker was correctly identified 96% of the time... so 4% of the time your face still ends up on screen, yellow hat and all! That may seem rather useless, but it's only the beginning, and the researchers expect much more progress... And Ken Goldberg, the big boss, is thinking of using a less conspicuous marker, like a button...
Source: TechnologyReview, Wednesday.
For those who do not know: Boosting is a machine learning meta-algorithm for performing supervised learning. Boosting occurs in stages, by incrementally adding to the current learned function. At every stage, a weak learner (i.e., one that has an accuracy only slightly greater than chance) is trained with the data. The output of the weak learner is then added to the learned function, with some strength (proportional to how accurate the weak learner is). Then, the data is reweighted: examples that the current learned function gets wrong are "boosted" in importance, so that future weak learners will attempt to fix the errors. There are several different boosting algorithms, depending on the exact mathematical form of the strength and weight. AdaBoost is the most popular and historically most significant boosting algorithm, though more recent algorithms such as LPBoost and TotalBoost converge faster and produce sparser hypothesis weightings. Most boosting algorithms fit into the AnyBoost framework, which shows that boosting performs gradient descent in function space...
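The staged procedure above can be sketched in a few lines. This is a minimal AdaBoost on 1-D "decision stump" weak learners with an invented toy dataset, nothing like the Berkeley vision system's real features, but it shows the strength/reweight mechanics just described:

```python
# Minimal AdaBoost with 1-D decision stumps (toy illustration; the data
# and the stump learner are invented, not the Berkeley system's).
import math

def train_adaboost(xs, ys, n_rounds=5):
    """xs: list of floats, ys: +1/-1 labels.
    Returns a list of (alpha, threshold, sign) weak learners."""
    n = len(xs)
    w = [1.0 / n] * n                      # example weights, start uniform
    # Candidate stumps: predict `s` when x < theta, else -s.
    candidates = [(t, s) for t in sorted(set(xs)) + [max(xs) + 1.0]
                  for s in (+1, -1)]
    ensemble = []
    for _ in range(n_rounds):
        # Train the weak learner: pick the stump with lowest weighted error.
        best = None
        for theta, s in candidates:
            err = sum(wi for wi, x, y in zip(w, xs, ys)
                      if (s if x < theta else -s) != y)
            if best is None or err < best[0]:
                best = (err, theta, s)
        err, theta, s = best
        if err >= 0.5:                     # no better than chance: stop
            break
        err = max(err, 1e-12)              # guard against division by zero
        # Strength is proportional to how accurate the weak learner is.
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((alpha, theta, s))
        # Reweight: examples the stump got wrong are "boosted" in importance.
        for i in range(n):
            pred = s if xs[i] < theta else -s
            w[i] *= math.exp(-alpha * ys[i] * pred)
        z = sum(w)
        w = [wi / z for wi in w]           # normalize back to a distribution
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all weak learners.
    score = sum(a * (s if x < t else -s) for a, t, s in ensemble)
    return 1 if score >= 0 else -1
```

On a labeling like +,+,+,-,-,-,+,+ over x = 0..7, no single stump can be right everywhere, but a few boosted rounds combine stumps into a classifier that fits the whole pattern.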
Labels: anecdote