Tech for good
[Google Cloud Skills Boost(Qwiklabs)] Introduction to Generative AI Learning Path - 6. Encoder-Decoder Architecture
Diana Kang · 2023. 9. 8. 00:58
https://www.youtube.com/playlist?list=PLIivdWyY5sqIlLF9JHbyiqzZbib9pFt4x
Generative AI Learning Path
https://goo.gle/LearnGenAI
- Encoding stage
- It produces a vector representation of the input sentence.
- Decoding stage
- It generates the output sequence.
Both the encoder and the decoder can be implemented with different internal architectures.
The internal mechanism can be a recurrent neural network (RNN).
An RNN encoder takes the tokens of the input sequence one at a time and produces a state that represents both the current token and all the previously ingested tokens. That state is then fed into the next encoding step as input.
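The step-by-step state update described above can be sketched with a toy NumPy RNN cell. All dimensions, weights, and token ids below are made up purely for illustration:

```python
import numpy as np

def rnn_encoder(tokens, embed, W_x, W_h, b):
    """Consume one token at a time, updating a hidden state that
    summarizes the current token and everything ingested so far."""
    state = np.zeros(W_h.shape[0])
    for tok in tokens:
        x = embed[tok]                              # embedding of the current token
        state = np.tanh(W_x @ x + W_h @ state + b)  # state feeds into the next step
    return state                                    # vector representation of the whole sentence

# Hypothetical sizes: vocabulary of 5 tokens, embedding dim 4, hidden dim 3
rng = np.random.default_rng(0)
embed = rng.normal(size=(5, 4))
W_x = rng.normal(size=(3, 4))
W_h = rng.normal(size=(3, 3))
b = np.zeros(3)

sentence = [2, 0, 4, 1]  # made-up token ids for an input sequence
final_state = rnn_encoder(sentence, embed, W_x, W_h, b)
print(final_state.shape)  # (3,) — one fixed-size vector for the whole input
```

The final state is what the decoding stage receives: a single fixed-size vector, regardless of how long the input sentence was.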
- At each step, the decoder generates only the probability that each token in your vocabulary is the next one.
- Using these probabilities, you have to select a word, and there are several approaches for doing so.
- The simplest one, called greedy search, is to generate the token that has the highest probability.
- An approach that produces better results is called beam search.
- In that case, you use the probabilities generated by the decoder to evaluate the probability of sentence chunks rather than of individual words.
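The contrast between the two strategies can be shown with a tiny hand-made example. The conditional probabilities below are invented so that the greedy choice at step one leads to a lower-probability sentence overall, which beam search avoids by scoring whole chunks:

```python
import numpy as np

# Hypothetical distributions over a 2-token vocabulary (made-up numbers)
START_PROBS = np.array([0.6, 0.4])       # P(first token)
NEXT_PROBS = {0: np.array([0.5, 0.5]),   # P(next token | previous = 0)
              1: np.array([0.9, 0.1])}   # P(next token | previous = 1)

def greedy_decode(steps=2):
    """At each step, emit the single most probable next token."""
    seq, probs = [], START_PROBS
    for _ in range(steps):
        tok = int(np.argmax(probs))
        seq.append(tok)
        probs = NEXT_PROBS[tok]
    return seq

def beam_search(steps=2, beam_width=2):
    """Score whole chunks: keep the beam_width most probable partial sequences."""
    beams = [([], 1.0, START_PROBS)]
    for _ in range(steps):
        candidates = []
        for seq, score, probs in beams:
            for tok in range(len(probs)):
                candidates.append((seq + [tok], score * probs[tok], NEXT_PROBS[tok]))
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

print(greedy_decode())  # [0, 0] — probability 0.6 * 0.5 = 0.30
print(beam_search())    # [1, 0] — probability 0.4 * 0.9 = 0.36
```

Greedy search commits to token 0 because it is locally the most probable, but beam search keeps the weaker prefix alive and discovers that the chunk starting with token 1 ends up more probable overall.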