Contents:
Visual Classification with Multi-Task Joint Sparse Representation
Problem to Study
Motivations
Related Work
Our Method
Joint Sparse Representation
Formulation
Different Mixed-Norms
Optimization
Classification
Algorithm
Kernel-View Extensions
Algorithm
Column Generation
Experiments
Data Sets
Results on Oxford flowers 17
Results on Oxford flowers 17 (cont.)
Results on Oxford flowers 102
Results on Caltech 101
Conclusions
Thank you
Visual Classification with Multi-Task Joint Sparse Representation

Presentation: "Visual Classification with Multi-Task Joint Sparse Representation". Author (per file metadata): Markus Svensén & Christopher M. Bishop. File: "Visual Classification with Multi-Task Joint Sparse Representation.ppt". Zip archive size: 26333 KB.

Visual Classification with Multi-Task Joint Sparse Representation

Contents of the presentation "Visual Classification with Multi-Task Joint Sparse Representation.ppt", slide by slide:
1 Visual Classification with Multi-Task Joint Sparse Representation

Authors: Xiaotong Yuan and Shuicheng Yan. Presenter: Xiaotong Yuan.

Learning & Vision Research Group, ECE, National University of Singapore

2 Problem to Study

Task: combine multiple features for visual classification.

Training-based methods (MKL and SVM ensembles) require classifier training plus combination. Our solution: cast feature combination as a multi-task joint sparse representation problem.

In most cases, multiple features (color, texture, shape) are available.

3 Motivations

Advances in sparse representation (SR) for recognition (Wright et al., 2009): robust and training-free.
Advances in multi-task sparse learning: separate but related sparse learning tasks; joint sparsity (Zhang, 2006; Liu et al., 2009).

4 Related Work

Kernel feature combination: Multiple Kernel Learning (Varma & Ray, 2007); boosting individual SVM classifiers (Gehler & Nowozin, 2009).
Multi-task joint covariate selection (Obozinski et al., 2009).
Group Lasso (Yuan & Lin, 2006).
Multi-task Lasso (Zhang, 2006).
Sparse representation (SR) for recognition (Wright et al., 2009).

5 Our Method

MTJSRC: a Multi-Task Joint Sparse Representation and Classification method.
- Utilizes each feature to form a linear representation task.
- Jointly selects representative images from few classes.
- Boosts the individual features to improve performance.
- Extends to the kernel view.

6 Joint Sparse Representation

[Diagram] Given a set of training images with K types of features extracted from each, and a test image, the objective is to recover block-level sparse representation coefficients.
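To fix notation for the slides that follow (a reconstruction, since the original equations appear only as images): with n training images from J classes and K feature types, task k has a dictionary $X^k \in \mathbb{R}^{d_k \times n}$ whose columns are training features, a test feature $y^k \in \mathbb{R}^{d_k}$, and a coefficient vector $w^k \in \mathbb{R}^n$; stacking the $w^k$ as columns gives $W \in \mathbb{R}^{n \times K}$.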

7 Formulation

A supervised K-task linear representation problem.

General formulation: multi-task least-squares regression with mixed-norm regularization (reconstructed below). The choice of mixed norm trades off between a less sparse but convex penalty and a sparser but non-convex one.
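The objective itself is an image in the slides; the following is a plausible reconstruction consistent with the joint-sparsity references cited here (Obozinski et al., 2009; Zhang, 2006), where $w_i$ denotes the i-th row of $W$, i.e. the coefficients of training image i across all K tasks:

\[
\min_{W \in \mathbb{R}^{n \times K}} \; \sum_{k=1}^{K} \tfrac{1}{2} \left\| y^k - X^k w^k \right\|_2^2 \;+\; \lambda \, \|W\|_{\ell_p,\ell_q},
\qquad
\|W\|_{\ell_p,\ell_q} = \Big( \sum_{i=1}^{n} \|w_i\|_q^{\,p} \Big)^{1/p}.
\]

With p = 1 the penalty is convex but less sparse; driving p below 1 (toward 0) yields sparser but non-convex penalties, matching the trade-off noted on the slide.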

8-9 Different Mixed-Norms

- Joint sparsity-inducing (Obozinski et al., 2009)
- Joint sparsity-inducing (Zhang, 2006)
- K independent SR tasks
- K independent ridge regression tasks
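The norm symbols accompany these labels only in the slide figure; a likely mapping, inferred from the cited papers, is:

- $\ell_{1,2}$ (p = 1, q = 2): joint sparsity across tasks, as in multi-task joint covariate selection (Obozinski et al., 2009);
- $\ell_{1,\infty}$ (p = 1, q = ∞): joint sparsity, as in the multi-task Lasso (Zhang, 2006);
- $\ell_{1,1}$ (p = q = 1): the penalty decouples into K independent $\ell_1$ problems, i.e. K independent SR tasks;
- $\ell_{2,2}$ (p = q = 2, squared in practice): the penalty decouples into K independent ridge regressions.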

10 Optimization

An Accelerated Proximal Gradient (APG) method (Tseng, 2008), alternating two steps (sketched in code below):
- generalized gradient mapping step;
- aggregation step.
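As a concrete illustration of the two steps named above, here is a minimal NumPy sketch of APG for the $\ell_{1,2}$-regularized objective reconstructed earlier; the function names and the fixed step size 1/L are my own choices, not from the slides:

```python
import numpy as np

def block_soft_threshold(W, tau):
    # Proximal operator of tau * sum_i ||W[i, :]||_2 (row-wise shrinkage).
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return scale * W

def mtjsrc_apg(Xs, ys, lam=0.01, n_iter=200):
    """APG for  min_W  0.5 * sum_k ||y_k - X_k W[:, k]||^2
                       + lam * sum_i ||W[i, :]||_2.
    Xs: list of K arrays, Xs[k] of shape (d_k, n); ys[k] of shape (d_k,)."""
    K, n = len(Xs), Xs[0].shape[1]
    # Step size 1/L, with L an upper bound on the Lipschitz constant of the
    # smooth part (largest squared spectral norm among the dictionaries).
    L = max(np.linalg.norm(X, 2) ** 2 for X in Xs)
    W = np.zeros((n, K))
    V, t = W.copy(), 1.0
    for _ in range(n_iter):
        # Generalized gradient mapping step at the aggregation point V:
        # gradient step on the least-squares term, then the row-wise prox.
        G = np.column_stack([Xs[k].T @ (Xs[k] @ V[:, k] - ys[k])
                             for k in range(K)])
        W_next = block_soft_threshold(V - G / L, lam / L)
        # Aggregation step (Nesterov momentum).
        t_next = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        V = W_next + ((t - 1.0) / t_next) * (W_next - W)
        W, t = W_next, t_next
    return W
```

The gradient mapping step combines a gradient step on the smooth least-squares term with the block soft-thresholding prox of the $\ell_{1,2}$ penalty; the aggregation step is the standard momentum extrapolation.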

11 Classification

Given the optimal reconstruction coefficients, the decision is made from class-wise reconstruction errors (the rule is reconstructed below).

Feature confidences are learned via Linear Programming Boosting (LPBoost) on a validation set.
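The decision rule is shown only as an image in the slides; based on the slide text (class-wise reconstruction errors weighted by learned feature confidences $\theta_k$), a plausible reconstruction is:

\[
\hat{j} \;=\; \arg\min_{j \in \{1,\dots,J\}} \; \sum_{k=1}^{K} \theta_k \left\| y^k - X_j^k \, w_j^{k*} \right\|_2,
\]

where $X_j^k$ and $w_j^{k*}$ are the columns of $X^k$ and the entries of the optimal $w^{k*}$ associated with class j.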

12 Algorithm

Algorithm 1. Multi-Task Joint Sparse Representation Classification: iterate the generalized gradient mapping step and the aggregation step, as sketched above.

13 Kernel-View Extensions

Multi-task joint sparse representation in an RKHS. The APG optimization is characterized entirely by inner products of feature vectors, so it can be expressed through the training kernel matrix and the testing kernel vector (see below).
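Concretely, because the APG iterations touch the data only through inner products, every needed quantity can be precomputed from kernels. Under the notation above (my reconstruction), the task-k gradient becomes

\[
\nabla_k \;=\; X^{k\top}\!\big(X^k w^k - y^k\big) \;=\; K^k w^k - \kappa^k,
\qquad
K^k = X^{k\top} X^k, \quad \kappa^k = X^{k\top} y^k,
\]

so only the training kernel matrix $K^k$ and the testing kernel vector $\kappa^k$ are required, which is what makes the RKHS extension possible.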

14 Algorithm

Algorithm 2. MTJSRC-RKHS: the same generalized gradient mapping and aggregation steps, computed from kernel matrices.

15 Column Generation

MTJSRC-CG: take the columns of each kernel matrix as feature vectors. The objective and decision rule keep the same form (a reconstruction follows below).
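Under the same notation, a plausible reading of MTJSRC-CG is that each kernel-matrix column $\kappa(\cdot, x_i)$ is treated directly as a feature vector, i.e. $X^k$ is replaced by $K^k$ and $y^k$ by the test kernel vector $\kappa^k$:

\[
\min_{W} \; \sum_{k=1}^{K} \tfrac{1}{2} \left\| \kappa^k - K^k w^k \right\|_2^2 \;+\; \lambda \, \|W\|_{\ell_1,\ell_2},
\]

with the classification decision made from class-wise reconstruction errors exactly as before.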

16 Experiments

Comparing feature combination algorithms:
- Nearest Subspace (NS) + combination
- Sparse Representation (SRC) + combination
- Representative kernel feature combination methods from the literature (Nilsback, 2008/2009; Varma, 2007; Gehler, 2009)

17 Data Sets

- Oxford flowers 17: 7 kernels (Nilsback, 2009)
- Oxford flowers 102: 4 kernels (Nilsback, 2008)
- Caltech 101: 4 kernels (Varma et al., 2007)

18 Results on Oxford flowers 17

Accuracies by Feature Combination (%):

NS: 83.2 ± 2.1
SRC: 85.9 ± 2.2
MKL (Nilsback, 2009): 88.2 ± 2.0
CG-Boost (Gehler, 2009): 84.8 ± 2.2
LPBoost (Gehler, 2009): 85.4 ± 2.4
MTJSRC-RKHS: 88.1 ± 2.3
MTJSRC-CG: 88.9 ± 2.9

Accuracies by Single Features (%):

Features | NS         | SVM (Gehler, 2009) | MTJSRC-RKHS (K=1) | MTJSRC-CG (K=1)
Color    | 61.7 ± 3.3 | 60.9 ± 2.1         | 64.0 ± 2.1        | 64.0 ± 3.3
Shape    | 69.9 ± 3.2 | 70.2 ± 1.3         | 72.7 ± 0.3        | 71.5 ± 0.8
Texture  | 55.8 ± 1.4 | 63.7 ± 2.7         | 67.6 ± 2.4        | 67.6 ± 2.2
HSV      | 61.3 ± 0.7 | 62.9 ± 2.3         | 64.7 ± 4.1        | 65.0 ± 3.9
HOG      | 57.4 ± 3.0 | 58.5 ± 4.5         | 61.9 ± 3.6        | 62.6 ± 2.7
SIFTint  | 70.7 ± 0.7 | 70.6 ± 1.6         | 74.0 ± 2.2        | 74.0 ± 2.0
SIFTbdy  | 61.9 ± 4.2 | 59.4 ± 3.3         | 62.4 ± 3.2        | 63.2 ± 3.3

19 Results on Oxford flowers 17 (cont.)

[Figure: sparse representation coefficients and reconstruction errors, shown per feature channel: Color, Shape, Texture, HSV, HOG, SIFTint, SIFTbdy]

20 Results on Oxford flowers 102

Accuracies by Feature Combination (%):

NS: 59.2
SRC: 70.0
MKL (Nilsback, 2008): 72.8
MTJSRC-RKHS: 73.8
MTJSRC-CG: 74.1

Accuracies by Single Features (%):

Features | NS   | SVM (Nilsback, 2008) | MTJSRC-RKHS (K=1) | MTJSRC-CG (K=1)
HSV      | 39.8 | 43.0                 | 43.6              | 42.5
HOG      | 34.9 | 49.6                 | 46.7              | 48.1
SIFTint  | 46.6 | 55.1                 | 54.7              | 55.2
SIFTbdy  | 34.1 | 32.0                 | 33.0              | 31.6

21 Results on Caltech 101

Accuracies by Feature Combination (15 train / 15 test, %):

NS: 51.7 ± 0.8
SRC: 69.2 ± 0.7
MKL (Varma, 2007): 70.0 ± 1.0
LPBoost (Gehler, 2009): 70.7 ± 0.4
MTJSRC-RKHS: 71.0 ± 0.3
MTJSRC-CG: 71.4 ± 0.4

Accuracies by Single Features (%):

Features   | NS         | SVM (Varma, 2007) | MTJSRC-RKHS (K=1) | MTJSRC-CG (K=1)
GB         | 40.8 ± 0.6 | 62.6 ± 1.2        | 58.3 ± 0.4        | 58.5 ± 0.3
PHOW-gray  | 45.4 ± 0.9 | 63.9 ± 0.8        | 65.0 ± 0.7        | 64.5 ± 0.5
PHOW-color | 37.3 ± 0.5 | 54.5 ± 0.6        | 56.1 ± 0.5        | 54.4 ± 0.7
SSIM       | 39.8 ± 0.8 | 54.3 ± 0.6        | 61.8 ± 0.6        | 59.7 ± 0.4

22 Conclusions

Multi-task joint sparse representation is effective for combining complementary visual features. On single features, the kernel-view extensions of MTJSRC are quite competitive with SVM. MTJSRC requires no model training.

23 Thank you

24 Visual Classification with Multi-Task Joint Sparse Representation

Xiaotong Yuan and Shuicheng Yan

Source: http://900igr.net/prezentacija/anglijskij-jazyk/visual-classification-with-multi-task-joint-sparse-representation-121733.html
