ํƒœ๊ทธ ๋ณด๊ด€๋ฌผ: eigenvalues

eigenvalues

Andrew Ng๊ฐ€ PCA๋ฅผ ์ˆ˜ํ–‰ํ•˜๊ธฐ ์œ„ํ•ด ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ EIG๊ฐ€ ์•„๋‹Œ SVD๋ฅผ ์„ ํ˜ธํ•˜๋Š” ์ด์œ ๋Š” ๋ฌด์—‡์ž…๋‹ˆ๊นŒ? ์Šคํƒ ํฌ๋“œ NLP ๊ณผ์ •์—์„œ cs224n์˜ ์ฒซ ๋ฒˆ์งธ

Andrew Ng์˜ Coursera ์ฝ”์Šค ๋ฐ ๊ธฐํƒ€ ์ž๋ฃŒ์—์„œ PCA๋ฅผ ๊ณต๋ถ€ํ•˜๊ณ  ์žˆ์Šต๋‹ˆ๋‹ค. ์Šคํƒ ํฌ๋“œ NLP ๊ณผ์ •์—์„œ cs224n์˜ ์ฒซ ๋ฒˆ์งธ ๊ณผ์ œ ์™€ Andrew Ng ์˜ ๊ฐ•์˜ ๋น„๋””์˜ค์—์„œ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ ๊ณ ์œ  ๋ฒกํ„ฐ ๋ถ„ํ•ด ๋Œ€์‹  ํŠน์ด ๊ฐ’ ๋ถ„ํ•ด๋ฅผ ์ˆ˜ํ–‰ํ•˜๋ฉฐ Ng๋Š” SVD๊ฐ€ ๊ณ ์œ  ๋ถ„ํ•ด๋ณด๋‹ค ์ˆ˜์น˜ ์ ์œผ๋กœ ๋” ์•ˆ์ •์ ์ด๋ผ๊ณ  ๋งํ•ฉ๋‹ˆ๋‹ค.

PCA์˜ ๊ฒฝ์šฐ (m,n)ํฌ๊ธฐ์˜ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์ด ์•„๋‹Œ ํฌ๊ธฐ ์˜ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์˜ SVD๋ฅผ ์ˆ˜ํ–‰ํ•ด์•ผ (n,n)ํ•ฉ๋‹ˆ๋‹ค. ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ ๊ณ ์œ  ๋ฒกํ„ฐ ๋ถ„ํ•ด.

์™œ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์ด ์•„๋‹Œ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ SVD๋ฅผ ์ˆ˜ํ–‰ํ•ฉ๋‹ˆ๊นŒ?



๋‹ต๋ณ€

amoeba๋Š” ์ด๋ฏธ ์˜๊ฒฌ์— ์ข‹์€ ๋Œ€๋‹ต์„ํ–ˆ์ง€๋งŒ ๊ณต์‹์ ์ธ ์ฃผ์žฅ์„ ์›ํ•œ๋‹ค๋ฉด ์—ฌ๊ธฐ์— ๊ฐ„๋‹ค.

ํ–‰๋ ฌ์˜ ํŠน์ด ๊ฐ’ ๋ถ„ํ•ด ์ธ = U ฮฃ V T ์˜ ์—ด, V๋Š” ์˜ ๊ณ ์œ  ๋ฒกํ„ฐ์ด๋‹ค T ๊ทธ๋ฆฌ๊ณ  ๋Œ€๊ฐ์„  ์—”ํŠธ๋ฆฌ ฮฃ ์žˆ๋‹ค ์ œ๊ณฑ๊ทผ ๊ทธ ๊ณ ์œ ์˜, ์ฆ‰ ฯƒ I I = โˆš

์—์ด

A=UฮฃVT

V

ATA

ฮฃ

.

ฯƒii=ฮปi(ATA)

์•„์‹œ๋‹ค์‹œํ”ผ ์ฃผ์„ฑ๋ถ„์€ ๊ฒฝํ—˜์  ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ ๊ณ ์œ  ๋ฒกํ„ฐ ๊ณต๊ฐ„์— ๋Œ€ํ•œ ๋ณ€์ˆ˜์˜ ์ง๊ต ํˆฌ์˜์ž…๋‹ˆ๋‹ค . ์„ฑ๋ถ„์˜ ๋ถ„์‚ฐ์€ ๊ณ ์œ  ๊ฐ’ฮปi(1

1nโˆ’1ATA

.

ฮปi(1nโˆ’1ATA)

B v = ฮป v๊ฐ€ ๋˜๋„๋ก ์ •์‚ฌ๊ฐ ํ–‰๋ ฌ , ฮฑ โˆˆ R ๋ฐ ๋ฒกํ„ฐ v๋ฅผ ๊ณ ๋ คํ•˜์‹ญ์‹œ์˜ค . ๊ทธ๋•Œ

B

ฮฑโˆˆR

v

Bv=ฮปV

  1. Bkv=ฮป์ผ€์ดV


  2. ฮป(ฮฑ๋น„)=ฮฑฮป(๋น„)

S = 1์„ ์ •์˜ํ•˜์ž. SVD์˜S๋Š”์˜ ๊ณ„์‚ฐํ•œ๋‹ค eigendecompositionSTS=1

์—์Šค=1์—”โˆ’1์—์ดํ‹ฐ์—์ด

์—์Šค

๋ฅผ ์‚ฐ์ถœ

STS=1(nโˆ’1)2ATAATA
  1. ์˜ ๊ณ ์œ  ๋ฒกํ„ฐ ์†์„ฑ 1์˜ ๊ฒƒ์ด๋ฉฐ, T
    (ATA)TATA=ATAATA

    ATA

  2. ์ œ๊ณฑ๊ทผ ์˜ ๊ณ ์œ ์˜ , ํŠน์„ฑ 2, 1, 2๋Š” ๋‹ค์‹œโˆš
    1(nโˆ’1)2ATAATA

    . 1(nโˆ’1)2ฮปi(ATAATA)=1(nโˆ’1)2ฮปi2(ATA)=1nโˆ’1ฮปi(ATA)=ฮปi(1nโˆ’1ATA)

Voilà!
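The conclusion, that the singular values of the covariance matrix $S$ coincide with its eigenvalues, can be checked numerically; a minimal sketch assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((100, 4))
A = A - A.mean(axis=0)              # center the data
S = A.T @ A / (A.shape[0] - 1)      # empirical covariance, 1/(n-1) A^T A

sv = np.linalg.svd(S, compute_uv=False)   # singular values of S, descending
lam = np.linalg.eigh(S)[0][::-1]          # eigenvalues of S, descending

# For a positive semi-definite matrix the two coincide
assert np.allclose(sv, lam)
```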

์ˆ˜์น˜ ์•ˆ์ •์„ฑ๊ณผ ๊ด€๋ จํ•˜์—ฌ, ์‚ฌ์šฉ ๋œ ์•Œ๊ณ ๋ฆฌ์ฆ˜์ด ๋ฌด์—‡์ธ์ง€ ์•Œ์•„ ๋‚ด์•ผํ•ฉ๋‹ˆ๋‹ค. ๋‹น์‹ ์ด ๊ทธ๊ฒƒ์— ๋‹ฌ๋ ค ์žˆ๋‹ค๋ฉด, ๋‚˜๋Š” ์ด๊ฒƒ์ด numpy์— ์˜ํ•ด ์‚ฌ์šฉ๋˜๋Š” LAPACK ๋ฃจํ‹ด์ด๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค :

์—…๋ฐ์ดํŠธ : ์•ˆ์ •์„ฑ์—์„œ SVD ๊ตฌํ˜„์€ ๋ถ„ํ•  ๋ฐ ์ •๋ณต ์ ‘๊ทผ๋ฒ•์„ ์‚ฌ์šฉํ•˜๋Š” ๊ฒƒ์ฒ˜๋Ÿผ ๋ณด์ด์ง€๋งŒ ๊ณ ์œ  ๋ถ„ํ•ด๋Š” ์ผ๋ฐ˜ QR ์•Œ๊ณ ๋ฆฌ์ฆ˜์„ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค. ๊ธฐ๊ด€์—์„œ ์ œ๊ณตํ•˜๋Š” ๊ด€๋ จ SIAM ๋…ผ๋ฌธ (๋น„๋‚œ ์—ฐ๊ตฌ ์‚ญ๊ฐ)์— ์•ก์„ธ์Šค ํ•  ์ˆ˜ ์—†์ง€๋งŒ SVD ๋ฃจํ‹ด์ด ๋” ์•ˆ์ •์ ์ด๋ผ๋Š” ํ‰๊ฐ€๋ฅผ ๋’ท๋ฐ›์นจ ํ•  ์ˆ˜์žˆ๋Š” ๊ฒƒ์„ ๋ฐœ๊ฒฌํ–ˆ์Šต๋‹ˆ๋‹ค.

์—์„œ

Nakatsukasa, Yuji, and Nicholas J. Higham. "Stable and efficient spectral divide and conquer algorithms for the symmetric eigenvalue decomposition and the SVD." SIAM Journal on Scientific Computing 35.3 (2013): A1325-A1349.

๊ทธ๋“ค์€ ๋‹ค์–‘ํ•œ ๊ณ ์œ  ๊ฐ’ ์•Œ๊ณ ๋ฆฌ์ฆ˜์˜ ์•ˆ์ •์„ฑ์„ ๋น„๊ตํ•˜๋ฉฐ, ๋ถ„ํ•  ๋ฐ ์ •๋ณต ์ ‘๊ทผ๋ฒ• (์‹คํ—˜ ์ค‘ ํ•˜๋‚˜์—์„œ numpy์™€ ๋™์ผํ•œ ๋ฐฉ์‹์„ ์‚ฌ์šฉํ•ฉ๋‹ˆ๋‹ค!)์ด QR ์•Œ๊ณ ๋ฆฌ์ฆ˜๋ณด๋‹ค ๋” ์•ˆ์ •์ ์ธ ๊ฒƒ์œผ๋กœ ๋ณด์ž…๋‹ˆ๋‹ค. ์ด๊ฒƒ์€ D & C ๋ฐฉ๋ฒ•์ด ๋” ์•ˆ์ •์ ์ด๋ผ๋Š” ๋‹ค๋ฅธ ์ฃผ์žฅ๊ณผ ํ•จ๊ป˜ Ng์˜ ์„ ํƒ์„ ๋’ท๋ฐ›์นจํ•ฉ๋‹ˆ๋‹ค.


๋‹ต๋ณ€

@amoeba ํฌํ•จ PCA ์งˆ๋ฌธ์— ํ›Œ๋ฅญํ•œ ๋Œ€๋‹ตํ–ˆ๋‹ค ์ด ํ•œ PCA์— SVD์˜ ๊ด€๊ณ„์—์žˆ๋‹ค. ์ •ํ™•ํ•œ ์งˆ๋ฌธ์— ๋Œ€๋‹ตํ•˜๋ฉด ์„ธ ๊ฐ€์ง€ ์ ์„ ์•Œ๋ ค ๋“œ๋ฆฌ๊ฒ ์Šต๋‹ˆ๋‹ค.

  • ์ˆ˜ํ•™์ ์œผ๋กœ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์—์„œ ์ง์ ‘ ๋˜๋Š” ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์—์„œ PCA๋ฅผ ๊ณ„์‚ฐํ•˜๋Š”์ง€ ์—ฌ๋ถ€์—๋Š” ์ฐจ์ด๊ฐ€ ์—†์Šต๋‹ˆ๋‹ค.
  • ๊ทธ ์ฐจ์ด๋Š” ์ˆœ์ „ํžˆ ์ˆ˜์น˜ ์ •๋ฐ€๋„์™€ ๋ณต์žก์„ฑ ๋•Œ๋ฌธ์ž…๋‹ˆ๋‹ค. SVD๋ฅผ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์— ์ง์ ‘ ์ ์šฉํ•˜๋Š” ๊ฒƒ์€ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ๋ณด๋‹ค ์ˆ˜์น˜ ์ ์œผ๋กœ ๋” ์•ˆ์ •์ ์ž…๋‹ˆ๋‹ค.
  • SVD๋ฅผ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์— ์ ์šฉํ•˜์—ฌ PCA๋ฅผ ์ˆ˜ํ–‰ํ•˜๊ฑฐ๋‚˜ ๊ณ ์œ  ๊ฐ’์„ ์–ป์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์‹ค์ œ๋กœ ๊ณ ์œ  ๋ฌธ์ œ๋ฅผ ํ•ด๊ฒฐํ•˜๋Š” ๊ฒƒ์ด ๊ฐ€์žฅ ์ข‹์•„ํ•˜๋Š” ๋ฐฉ๋ฒ•์ž…๋‹ˆ๋‹ค.

SVD๋Š” ํŠนํžˆ ๋จธ์‹  ๋Ÿฌ๋‹์˜ ์ผ๋ฐ˜์ ์ธ ๊ณ ์œ  ๊ฐ’ ๊ฐ์†Œ ์ ˆ์ฐจ๋ณด๋‹ค ์•ˆ์ •์ ์ž…๋‹ˆ๋‹ค. ๋จธ์‹  ๋Ÿฌ๋‹์—์„œ๋Š” ๊ณต ์„ ํ˜• ํšŒ๊ท€ ๋ถ„์„์„ ์‰ฝ๊ฒŒ ์ˆ˜ํ–‰ ํ•  ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค. ์ด ๊ฒฝ์šฐ SVD๊ฐ€ ๋” ์ž˜ ์ž‘๋™ํ•ฉ๋‹ˆ๋‹ค.

์š”์ ์„ ์‹œ์—ฐํ•˜๋Š” Python ์ฝ”๋“œ๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค. ๋‚˜๋Š” ๋งค์šฐ ๊ณต์„ ์ ์ธ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์„ ๋งŒ๋“ค๊ณ  ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์„ ๊ฐ€์ ธ ์™€์„œ ํ›„์ž์˜ ๊ณ ์œ  ๊ฐ’์„ ์–ป์œผ๋ ค๊ณ ํ–ˆ์Šต๋‹ˆ๋‹ค. SVD๋Š” ์—ฌ์ „ํžˆ ์ž‘๋™ํ•˜์ง€๋งŒ์ด ๊ฒฝ์šฐ ์ผ๋ฐ˜์ ์ธ ๊ณ ์œ  ๋ถ„ํ•ด๋Š” ์‹คํŒจํ•ฉ๋‹ˆ๋‹ค.

import numpy as np
import math
from numpy import linalg as LA

np.random.seed(1)

# create the highly collinear series
T = 1000
X = np.random.rand(T,2)
eps = 1e-11
X[:,1] = X[:,0] + eps*X[:,1]

C = np.cov(np.transpose(X))
print('Cov: ',C)

U, s, V = LA.svd(C)
print('SVDs: ',s)

w, v = LA.eig(C)
print('eigen vals: ',w)

์‚ฐ์ถœ:

Cov:  [[ 0.08311516  0.08311516]
 [ 0.08311516  0.08311516]]
SVDs:  [  1.66230312e-01   5.66687522e-18]
eigen vals:  [ 0.          0.16623031]

์ตœ์‹  ์ •๋ณด

Federico Poloni์˜ ์˜๊ฒฌ์— ๋”ฐ๋ผ SVD์™€ Eig์˜ ์•ˆ์ •์„ฑ ํ…Œ์ŠคํŠธ๊ฐ€ ์œ„์˜ ๋™์ผํ•œ ํ–‰๋ ฌ์˜ ๋ฌด์ž‘์œ„ ์ƒ˜ํ”Œ 1000 ๊ฐœ์— ๋Œ€ํ•œ ์ฝ”๋“œ๊ฐ€ ์žˆ์Šต๋‹ˆ๋‹ค. ๋งŽ์€ ๊ฒฝ์šฐ์—, Eig๋Š” 0์˜ ์ž‘์€ ๊ณ ์œ  ๊ฐ’์„ ๋ณด์—ฌ์ฃผ๋Š”๋ฐ, ์ด๋Š” ํ–‰๋ ฌ์˜ ํŠน์ด์„ฑ์„ ์ดˆ๋ž˜ํ•  ๊ฒƒ์ž…๋‹ˆ๋‹ค. ๊ทธ๋ฆฌ๊ณ  SVD๋Š” ์—ฌ๊ธฐ์„œํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค. SVD๋Š” ์ž‘์€ ๊ณ ์œ  ๊ฐ’ ๊ฒฐ์ •์—์„œ ๋‘ ๋ฐฐ ์ •๋„ ๋” ์ •ํ™•ํ•˜๋ฉฐ, ๋ฌธ์ œ์— ๋”ฐ๋ผ ์ค‘์š”ํ•˜๊ฑฐ๋‚˜ ์ค‘์š”ํ•˜์ง€ ์•Š์„ ์ˆ˜ ์žˆ์Šต๋‹ˆ๋‹ค.

import numpy as np
import math
from scipy.linalg import toeplitz
from numpy import linalg as LA

np.random.seed(1)

# create the highly collinear series
T = 100
p = 2
eps = 1e-8

m = 1000 # simulations
err = np.ones((m,2)) # accuracy of small eig value
for j in range(m):
    u = np.random.rand(T,p)
    X = np.ones(u.shape)
    X[:,0] = u[:,0]
    for i in range(1,p):
        X[:,i] = eps*u[:,i]+u[:,0]

    C = np.cov(np.transpose(X))

    U, s, V = LA.svd(C)

    w, v = LA.eig(C)

    # true eigen values
    te = eps**2/2 * np.var(u[:,1])*(1-np.corrcoef(u,rowvar=False)[0,1]**2)
    err[j,0] = s[p-1] - te
    err[j,1] = np.amin(w) - te


print('Cov: ',C)
print('SVDs: ',s)
print('eigen vals: ',w)
print('true small eigenvals: ',te)

acc = np.mean(np.abs(err),axis=0)
print("small eigenval, accuracy SVD, Eig: ",acc[0]/te,acc[1]/te)

์‚ฐ์ถœ:

Cov:  [[ 0.09189421  0.09189421]
 [ 0.09189421  0.09189421]]
SVDs:  [ 0.18378843  0.        ]
eigen vals:  [  1.38777878e-17   1.83788428e-01]
true small eigenvals:  4.02633695086e-18
small eigenval, accuracy SVD, Eig:  2.43114702041 3.31970128319

x1=ux2=u+ฮตv

u,v

(ฯƒ12ฯƒ12+ฮตฯฯƒ1ฯƒ2ฯƒ12+ฮตฯฯƒ1ฯƒ2ฯƒ12+2ฮตฯฯƒ1ฯƒ2+ฮต2ฯƒ22ฯƒ2)

ฯƒ12,ฯƒ22,ฯ

โ€“ ์œ ๋‹ˆํผ๊ณผ ๊ทธ๋“ค ์‚ฌ์ด์˜ ์ƒ๊ด€ ๊ด€๊ณ„ coeffient์˜ ํŽธ์ฐจ๋ฅผ.

ฮป=12(ฯƒ22ฮต2โˆ’ฯƒ24ฮต4+4ฯƒ23ฯฯƒ1ฮต3+8ฯƒ22ฯ2ฯƒ12ฮต2+8ฯƒ2ฯฯƒ13ฮต+4ฯƒ14+2ฯƒ2ฯฯƒ1ฮต+2ฯƒ12)


The small eigenvalue canโ€™t be calculated by simply plugging the

ฮต

into formula due to limited precision, so you need to Taylor expand it:

ฮปโ‰ˆฯƒ22ฮต2(1โˆ’ฯ2)/2

I run

j=1,โ€ฆ,m

simulations of the realizations of the data matrix, calculate the eigenvalues of the simulated covariance matrix

ฮป^j

, and obtain the errors

ej=ฮปโˆ’ฮป^j

.
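To see the precision point concretely: evaluating the closed-form $\lambda$ directly in double precision loses essentially all significant digits, because the square root nearly cancels against $2\sigma_1^2$, while the Taylor expansion is well scaled. A sketch under the same setup (uniform $u$, $v$ and $\varepsilon = 10^{-8}$):

```python
import numpy as np

rng = np.random.default_rng(3)
eps = 1e-8
u = rng.random(1000)
v = rng.random(1000)
s1, s2 = np.std(u, ddof=1), np.std(v, ddof=1)
rho = np.corrcoef(u, v)[0, 1]

# Direct evaluation of the exact formula: catastrophic cancellation,
# since sqrt(...) agrees with 2*s1**2 + 2*s2*rho*s1*eps to ~16 digits
root = np.sqrt(s2**4*eps**4 + 4*s2**3*rho*s1*eps**3
               + 8*s2**2*rho**2*s1**2*eps**2
               + 8*s2*rho*s1**3*eps + 4*s1**4)
lam_direct = 0.5*(s2**2*eps**2 - root + 2*s2*rho*s1*eps + 2*s1**2)

# Taylor expansion: well scaled, no cancellation
lam_taylor = s2**2*eps**2*(1 - rho**2)/2

print(lam_direct, lam_taylor)  # lam_direct is dominated by rounding noise
```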


๋‹ต๋ณ€

For Python users, Iโ€™d like to point out that for symmetric matrices (like the covariance matrix), it is better to use numpy.linalg.eigh function instead of a general numpy.linalg.eig function.

eigh is 9-10 times faster than eig on my computer (regardless of matrix size) and has better accuracy (based on @Aksakalโ€™s accuracy test).
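A quick sketch of the agreement between the two routines on a symmetric matrix (the 9-10x speedup is machine-dependent, so only correctness is checked here):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((500, 500))
C = A @ A.T / 500   # symmetric PSD, like a covariance matrix

w_eig = np.sort(np.linalg.eig(C)[0].real)   # general algorithm, unordered output
w_eigh = np.linalg.eigh(C)[0]               # symmetric algorithm, ascending output

assert np.allclose(w_eig, w_eigh)
```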

I am not convinced with the demonstration of the accuracy benefit of SVD with small eigenvalues. @Aksakalโ€™s test is 1-2 orders of magnitude more sensitive to random state than to the algorithm (try plotting all errors instead of reducing them to one absolute maximum). It means that small errors in the covariance matrix will have a greater effect on accuracy than the choice of an eigendecomposition algorithm. Also, this is not related to the main question, which is about PCA. The smallest components are ignored in PCA.

A similar argument can be made about numerical stability. If I have to use the covariance matrix method for PCA, I would decompose it with eigh instead of svd. If it fails (which has not been demonstrated here yet), then it is probably worth rethinking the problem that you are trying to solve before starting to look for a better algorithm.


๋‹ต๋ณ€

์งˆ๋ฌธ์˜ ๋งˆ์ง€๋ง‰ ๋ถ€๋ถ„์— ๋‹ตํ•˜๊ธฐ ์œ„ํ•ด โ€œ์™œ ๊ทธ๋“ค์€ ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์ด ์•„๋‹Œ ๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์˜ SVD๋ฅผ ์ˆ˜ํ–‰ํ•ฉ๋‹ˆ๊นŒ?โ€ ๋‚˜๋Š” ๊ทธ๊ฒƒ์ด ์„ฑ๋Šฅ ๋ฐ ์ €์žฅ์ƒ์˜ ์ด์œ ๋ผ๊ณ  ์ƒ๊ฐํ•ฉ๋‹ˆ๋‹ค. ์ผ๋ฐ˜์ ์œผ๋กœ

์— 

๋งค์šฐ ํฐ ์ˆซ์ž๊ฐ€ ๋ ์ง€๋ผ๋„

์—”

ํฌ๋‹ค, ์šฐ๋ฆฌ๋Š” ๊ธฐ๋Œ€ํ•  ๊ฒƒ์ด๋‹ค

์— โ‰ซ์—”

.

๊ณต๋ถ„์‚ฐ ํ–‰๋ ฌ์„ ๊ณ„์‚ฐ ํ•œ ๋‹ค์Œ SVD๋ฅผ ์ˆ˜ํ–‰ํ•˜๋Š” ๊ฒƒ์€ ๋™์ผํ•œ ๊ฒฐ๊ณผ๋ฅผ ์œ„ํ•ด ์ด๋Ÿฌํ•œ ์กฐ๊ฑด์—์„œ ์ „์ฒด ๋ฐ์ดํ„ฐ ํ–‰๋ ฌ์— ๋Œ€ํ•œ SVD๋ฅผ ๊ณ„์‚ฐํ•˜๋Š” ๊ฒƒ๋ณด๋‹ค ํ›จ์”ฌ ๋น ๋ฆ…๋‹ˆ๋‹ค.

์ƒ๋‹นํžˆ ์ž‘์€ ๊ฐ’์ด๋ผ๋„ ์„ฑ๋Šฅ ํ–ฅ์ƒ์€ ์ˆ˜์ฒœ ๋ฐฐ (๋ฐ€๋ฆฌ ์ดˆ ๋Œ€ ์ดˆ)์ž…๋‹ˆ๋‹ค. Matlab์„ ์‚ฌ์šฉํ•˜์—ฌ ๋น„๊ตํ•˜๊ธฐ ์œ„ํ•ด ์ปดํ“จํ„ฐ์—์„œ ๋ช‡ ๊ฐ€์ง€ ํ…Œ์ŠคํŠธ๋ฅผ ์‹คํ–‰ํ–ˆ์Šต๋‹ˆ๋‹ค.

๊ทธ๊ฒƒ์€ CPU ์‹œ๊ฐ„ ์ผ ๋ฟ์ด์ง€ ๋งŒ ์Šคํ† ๋ฆฌ์ง€ ์š”๊ตฌ๋Š” ๊ทธ๋‹ค์ง€ ์ค‘์š”ํ•˜์ง€ ์•Š์Šต๋‹ˆ๋‹ค. Matlab์—์„œ ๋ฐฑ๋งŒ x ์ฒœ ๊ฐœ์˜ ๋งคํŠธ๋ฆญ์Šค์—์„œ SVD๋ฅผ ์‹œ๋„ํ•˜๋ฉด 7.4TB์˜ ์ž‘์—… ๋ฐฐ์—ด ํฌ๊ธฐ๊ฐ€ ํ•„์š”ํ•˜๊ธฐ ๋•Œ๋ฌธ์— ๊ธฐ๋ณธ์ ์œผ๋กœ ์˜ค๋ฅ˜๊ฐ€ ๋ฐœ์ƒํ•ฉ๋‹ˆ๋‹ค.


๋‹ต๋ณ€