Journal Article


Tan Z, Yang Y, Wan J, Wan H, Guo G, Li SZ. IEEE Trans. Image Process. 2019; ePub(ePub): ePub.


(Copyright © 2019, IEEE (Institute of Electrical and Electronics Engineers))
Recognizing pedestrian attributes in surveillance scenes is an inherently challenging task, especially for pedestrian images with large pose variations, complex backgrounds, and varied camera viewing angles. To select important and discriminative regions or pixels despite these variations, three attention mechanisms are proposed: parsing attention, label attention, and spatial attention. These mechanisms extract effective information from different perspectives. Specifically, parsing attention learns not only where to attend but also how to aggregate features of different semantic body parts, e.g., the head and upper body. Label attention collects discriminative features tailored to each individual attribute. Unlike the parsing and label attention mechanisms, spatial attention takes a global perspective, selecting a few important and discriminative image regions or pixels shared by all attributes. A joint learning framework, formulated in a multi-task fashion, is then proposed in which the three attention mechanisms are learned concurrently to extract complementary and correlated features. This framework is named Joint Learning of Parsing attention, Label attention and Spatial attention for Pedestrian Attributes Analysis (JLPLS-PAA). Extensive comparative evaluations on multiple large-scale benchmarks, including the PA-100K, RAP, PETA, Market-1501, and Duke attribute datasets, demonstrate the effectiveness of the proposed JLPLS-PAA framework for pedestrian attribute analysis.
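To illustrate the general idea behind the spatial-attention branch described above — scoring every spatial location of a feature map and pooling features by those scores — the following is a minimal, hypothetical sketch in NumPy. It is not the paper's implementation: the scoring weights `w`, the bias `b`, and the single-map softmax pooling are simplifying assumptions standing in for the learned layers of the actual network.

```python
import numpy as np

def spatial_attention_pool(features, w, b=0.0):
    """Pool a (C, H, W) feature map into a C-dim vector using a spatial
    attention map. Hypothetical sketch: one scalar score per location,
    softmax-normalized over all H*W positions."""
    C, H, W = features.shape
    # 1x1-conv-style scoring: dot each spatial column of features with w -> (H, W)
    scores = np.tensordot(w, features, axes=([0], [0])) + b
    # Softmax over all spatial locations so the weights sum to 1
    exp = np.exp(scores - scores.max())
    attn = exp / exp.sum()                                    # (H, W)
    # Attention-weighted average of the feature map -> (C,)
    return np.tensordot(features, attn, axes=([1, 2], [0, 1]))

# Toy usage with random features and random (untrained) scoring weights
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 4, 4))   # C=8 channels, 4x4 spatial grid
w = rng.standard_normal(8)
pooled = spatial_attention_pool(feat, w)
print(pooled.shape)  # (8,)
```

In the paper's framework this kind of selection runs alongside the parsing and label branches, and the three are trained jointly; label attention would analogously produce one such weighting per attribute rather than one shared map.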

Language: en
