Saturday, May 05, 2007

 

Respectful Camera



Computer scientists from the University of California, Berkeley (UC Berkeley) have developed a "respectful camera", a new type of video surveillance system that covers a person's face with an oval for privacy, but can remove the oval for an investigation!

Well, the technology is still in the research phase: for now the system can only cover someone's face if that person is wearing a marker such as a green vest or a yellow hat (!!). Still, the idea seems like a good compromise between privacy advocates and those concerned about security.

The researchers use a statistical classification approach called adaptive boosting (AdaBoost) to teach the system to identify the marker in a visually cluttered environment, and added a tracker to exploit the subject's velocity and other inter-frame information. In tests, the marker was correctly identified 96% of the time... which means our face is still on screen 4% of the time, yellow hat and all! That may sound nearly useless, but it's just the beginning, and the researchers expect much more progress... Ken Goldberg, the project's lead, is also thinking of using a less conspicuous marker, like a button...
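Just to picture what such a system might do frame by frame, here is a rough, hypothetical sketch in Python with OpenCV. A simple HSV colour threshold stands in for the AdaBoost marker classifier, the tracker is left out, and every threshold, name and geometric guess is invented for illustration; nothing here comes from the Berkeley implementation.

```python
# Toy sketch: detect a "yellow hat" marker by colour thresholding,
# then cover the region around it with an opaque oval.
# (OpenCV 4.x assumed; all values are made up for illustration.)
import cv2
import numpy as np

LOWER_HSV = np.array([20, 100, 100])   # assumed lower bound of "yellow"
UPPER_HSV = np.array([35, 255, 255])   # assumed upper bound of "yellow"

def cover_faces(frame):
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, LOWER_HSV, UPPER_HSV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 200:        # ignore small colour specks
            continue
        x, y, w, h = cv2.boundingRect(c)
        center = (x + w // 2, y + h)        # guess: face sits just below the hat
        axes = (w, int(1.5 * h))
        cv2.ellipse(frame, center, axes, 0, 0, 360, (0, 0, 0), -1)
    return frame
```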


Source: Technology Review, Wednesday.

For those who do not know: boosting is a machine learning meta-algorithm for supervised learning. Boosting proceeds in stages, incrementally adding to the current learned function. At every stage, a weak learner (i.e., one whose accuracy is only slightly better than chance) is trained on the data. The output of the weak learner is then added to the learned function, with some strength (proportional to how accurate the weak learner is). The data is then reweighted: examples that the current learned function gets wrong are "boosted" in importance, so that future weak learners will attempt to fix the errors. There are several different boosting algorithms, depending on the exact mathematical form of the strength and weight. AdaBoost is the most popular and historically most significant boosting algorithm, although more recent algorithms such as LPBoost and TotalBoost converge faster and produce sparser hypothesis weightings. Most boosting algorithms fit into the AnyBoost framework, which shows that boosting performs gradient descent in function space...
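To make the reweighting idea concrete, here is a minimal AdaBoost sketch in Python (NumPy only), using single-feature threshold "stumps" as the weak learners. It is a toy illustration of the algorithm described above, not the classifier used in the camera system; all function names and parameters are my own.

```python
# Minimal AdaBoost with decision stumps: each round trains a weak learner
# on reweighted data, adds it with strength alpha, and boosts the weight
# of the examples it got wrong.
import numpy as np

def train_stump(X, y, w):
    """Best single-feature threshold under weights w; labels y in {-1, +1}."""
    best = (0, 0.0, 1, np.inf)                  # (feature, threshold, polarity, error)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, thr, pol, err)
    return best

def adaboost(X, y, rounds=20):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        j, thr, pol, err = train_stump(X, y, w)
        err = max(err, 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # strength of this weak learner
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)          # boost misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
    return np.sign(score)

# Toy usage: separate two 2-D Gaussian clusters.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)
model = adaboost(X, y)
print("training accuracy:", np.mean(predict(model, X) == y))
```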
