Sensing and Modeling Human Networks

And from the INSA list: “[…] the first experiment in learning the face-to-face communication patterns of a large group by equipping the people within the community with wearable sensing devices. The main contribution of this thesis is to have demonstrated the feasibility of learning social interactions from raw sensory data. In this thesis we have presented a framework for automatic modeling of face-to-face interactions, starting from the data collection methods and working up to computational methods for learning the structure and dynamics of social networks.” I haven’t read this yet. Looks interesting.

2 comments

  1. Something was puzzling me about this paper and your previous post … and now I realize that it's the lack of "touch" awareness in the sensor devices. OK, we know the people are close, but we don't know whether they actually touched! Handshake anyone? What about a kiss? I guess you can extrapolate from the data … in the same way you can use Google to smell the coffee.

    I suppose if there were a store in SoHo selling haptic socialnet outfits, it would be called "Stay In Touch".

  2. Well yes, it was the contrast of the two on the same day. The sensor thing is asking to be strapped to a dog/panda/albatross and set free to generate interesting cross-connections…
