Computational Graph

A computational graph is a framework that expresses a function as a graph of simple operations: values and operators are separated, and the graph is built from nodes (operations) and their inputs. Because every intermediate variable in the graph is explicit, backpropagation can be run through all of them. When we want the gradient of the output of a function f with respect to some variable, the first step is to express f as a computational graph.
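As a minimal sketch of this first step (the function f(x, y, z) = (x + y) * z and the variable names are illustrative, not taken from these notes), the forward pass computes each node's output and the backward pass applies the chain rule node by node:

    # forward pass: break f(x, y, z) = (x + y) * z into an add node and a multiply node
    x, y, z = -2.0, 5.0, -4.0
    q = x + y            # add node
    f = q * z            # multiply node, final output (-12.0)

    # backward pass: start from df/df = 1 and apply the chain rule node by node
    df_dq = z            # d(q * z)/dq = z
    df_dz = q            # d(q * z)/dz = q
    df_dx = 1.0 * df_dq  # add node passes the gradient through: dq/dx = 1
    df_dy = 1.0 * df_dq  # dq/dy = 1

    print(df_dx, df_dy, df_dz)   # -4.0 -4.0 3.0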
Computational Graph example: a linear classifier with inputs x and W
Examples: AlexNet, Neural Turing Machine

Modularization

Computational Graph์—์„œ ์‹ค์ œ๋กœ ๋ชจ๋“ˆํ™”๋œ ๊ตฌํ˜„์ด๋‹ค. forward/backward๊ณผ์ •์„ ์ž˜ ์ˆ˜ํ–‰ํ•˜๊ธฐ ์œ„ํ•ด์„œ๋Š” ๋ชจ๋“ˆํ™”๊ฐ€ ํ•„์š”ํ•˜๋‹ค.
forward pass์—์„œ๋Š” ๋…ธ๋“œ์˜ ์ถœ๋ ฅ์„ ๊ณ„์‚ฐํ•˜๋Š” ํ•จ์ˆ˜๋ฅผ ๊ตฌํ˜„ํ•˜๋Š”๋ฐ ์ด๋•Œ ์™„์ „ํ•œ ๊ทธ๋ž˜ํ”„๋ฅผ ๊ฐ€์ง€๊ณ  ์žˆ๋‹ค๋ฉด ๊ทธ๋ž˜ํ”„์˜ ๋ชจ๋“  ๋…ธ๋“œ๋ฅผ ๋ฐ˜๋ณตํ•˜์—ฌ forward pass๋กœ ํ†ต๊ณผ์‹œํ‚จ๋‹ค.
backward pass์—์„œ๋Š” chain rule์„ ์ด์šฉํ•ด gradient๋ฅผ ๊ณ„์‚ฐํ•œ๋‹ค.
class ComputationalGraph(object):
    def forward(self, inputs):
        # 1. [pass inputs to input gates...]
        # 2. forward the computational graph:
        for gate in self.graph.nodes_topologically_sorted():
            gate.forward()
        return loss  # the final gate in the graph outputs the loss

    def backward(self):
        for gate in reversed(self.graph.nodes_topologically_sorted()):
            gate.backward()  # little piece of backprop (chain rule applied)
        return inputs_gradients
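Each gate referenced above is itself a small module with its own forward/backward. As a minimal sketch (the class name MultiplyGate and the upstream-gradient argument dz are illustrative assumptions, not from these notes), a multiply gate caches its inputs during the forward pass and applies the chain rule in the backward pass:

    class MultiplyGate(object):
        def forward(self, x, y):
            # cache the inputs; they are needed for the local gradients later
            self.x, self.y = x, y
            return x * y

        def backward(self, dz):
            # dz is the upstream gradient dL/dz; the chain rule gives
            # dL/dx = dz * y and dL/dy = dz * x
            return self.y * dz, self.x * dz

    gate = MultiplyGate()
    z = gate.forward(3.0, -4.0)    # z = -12.0
    dx, dy = gate.backward(2.0)    # dx = -8.0, dy = 6.0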