



AliCHI

AliCHI is a large-scale multi-modal dataset of face-to-face human conversations.

Download here

The open_data folder contains 100 group folders. Each group folder represents two people in conversation and contains multiple folders named "topicXX", each of which corresponds to one conversation session between those two people. Each topic folder contains 12 files: the language, head behavior, eye behavior, expression, eyebrow, and hand annotations for the two participants (A and B). For verbal data, we annotated the speaker's gender, the content of the speech, and its corresponding timestamps. We also annotated whether the speech segment contains abnormal environmental sounds and whether it overlaps with the other speaker's speech, which can also be derived from the other speaker's annotations. For non-verbal (behavioral) data, we annotated behavior labels together with their start and end timestamps.
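The snippet below is a minimal sketch of iterating over the directory layout described above. The exact annotation file names inside each topic folder are not specified here, so the code simply enumerates whatever files are present; adjust it to the naming used in the released data.

import os

DATA_ROOT = "open_data"  # path to the downloaded open_data folder (assumed location)

for group in sorted(os.listdir(DATA_ROOT)):            # 100 group folders
    group_dir = os.path.join(DATA_ROOT, group)
    if not os.path.isdir(group_dir):
        continue
    for topic in sorted(os.listdir(group_dir)):         # folders named "topicXX"
        topic_dir = os.path.join(group_dir, topic)
        if not (os.path.isdir(topic_dir) and topic.startswith("topic")):
            continue
        # Each topic folder holds 12 annotation files: language, head, eye,
        # expression, eyebrow, and hand annotations for participants A and B.
        annotation_files = sorted(os.listdir(topic_dir))
        print(group, topic, len(annotation_files), "files")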

This dataset is released for research purposes only. Contact us if you have any questions or suggestions.

Zhiling Luo (godot.lzl@alibaba-inc.com)




Last change: July 20, 2017