Human-like AI

Towards more natural, interactive, personalized, and human-inspired AI systems: seamless interaction between humans and AI through multimodal perception, multimodal instruction, personalized interaction and responses, and complex control (navigation, reasoning, etc.).

AI systems that communicate and collaborate seamlessly with humans


This research line focuses on the design of AI systems that communicate and interact with people as naturally as possible, for instance through natural language and image recognition. The AI system must be able to build layered, human-like reasoning by observing and understanding a complex environment, allowing computers to identify and solve problems independently. Through intuitive and social interaction, humans can combine their intelligence and physical abilities in harmony with complementary machines. This goal is still a long way off, but in many practical applications, such as recognizing patterns and generalizing across individual tasks, AI already makes a valuable contribution.

Structure of the challenge

Figure: overview of the structure of the research challenge

WP1, WP2, WP3 and WP4 are methodological work packages that cover both perception and cognition: starting from multimodal sensors, clear semantics are derived and fed into human-like learning algorithms, with a strong personalization and interaction aspect. The results of these work packages are integrated into WP5: use cases and demonstrators.

Following the objectives set forth above, this challenge can be split into two main parts, each addressed by two methodological work packages. The overall work package structure is illustrated in the figure above.

  • How can we get a detailed understanding of the environment at hand? This is tackled in WP 1 and WP 2.
  • How can we respond and react to tasks in a way that is similar to a human? This is tackled in WP 3 and WP 4.

Work packages

WP 1 Audio-visual perception & multimodal representations

WP 2 Deep learning-based conversational agents

WP 3 Interaction, personalization and recommendation

WP 4 Cognitive architectures & human-like learning

Contacts

Multiple research groups collaborate on this research domain. The list below gives the contact person for each work package and their affiliation.

Steven Latré

Management team, imec, UAntwerpen - IDLab

Tom De Schepper

Management team & WP5 Lead: Use Cases, UAntwerpen - IDLab

Tinne Tuytelaars

WP1 Lead: Audio-visual Perception and Multimodal Representations, KU Leuven

Walter Daelemans

WP2 Lead: Deep Learning-based Conversational Agents, UAntwerpen - CLiPS

Bart Goethals

WP3 Lead: Interaction, Personalization and Recommendation, UAntwerpen - ADREM

Geraint Wiggins

WP4 Lead: Cognitive Architectures & Human-like Learning, VUB