Rapid progress in low-level component technologies such as wearable cameras, sensors, displays and computers is making it possible to augment everyday living. Wearable and egocentric vision systems can analyze multi-modal data (e.g. video, audio, motion) and support the understanding of human interactions with the world, including gesture recognition, action recognition and social interaction recognition. By processing such data, wearable systems can enhance our capabilities and augment our perception, with state-of-the-art wearable sensing techniques supporting assistive technologies and advanced perception. This special issue intends to highlight research that supports human performance through egocentric sensing.