[Google Cloud Skills Boost(Qwiklabs)] Introduction to Generative AI Learning Path - 6. Encoder-Decoder Architecture
Diana Kang · 2023. 9. 8. 00:58
https://www.youtube.com/playlist?list=PLIivdWyY5sqIlLF9JHbyiqzZbib9pFt4x
- Encoding stage
- It produces a vector representation of the input sentence.
- Decoding stage
- It creates the sequence output.
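The two stages above can be sketched as a pair of functions: the encoder compresses the input sentence into one vector, and the decoder turns that vector into an output sequence. This is an illustrative toy (the arithmetic is made up, not a learned model):

```python
def encode(tokens):
    """Encoding stage: produce a (fake) vector representation of the input."""
    # Here we just average token ids; a real encoder learns this mapping.
    return sum(tokens) / len(tokens)

def decode(vector, length=3):
    """Decoding stage: create the sequence output from the encoder's vector."""
    # A real decoder predicts tokens step by step; we fabricate them here.
    return [round(vector) + i for i in range(length)]

sentence = [4, 8, 6]        # token ids for the input sentence
vector = encode(sentence)   # encoding stage
output = decode(vector)     # decoding stage
print(vector, output)
```

The key point is the interface: whatever the internal architecture, the encoder's output vector is the only thing the decoder sees.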
Both the encoder and the decoder can be implemented with different internal architectures.
The internal mechanism can be a recurrent neural network (RNN).
An RNN encoder takes each token in the input sequence one at a time and produces a state representing that token as well as all the previously ingested tokens. This state is then used as input to the next encoding step.
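The RNN encoder loop described above can be sketched in a few lines: each step consumes one token plus the previous state and emits a new state summarizing everything ingested so far. The transition function here is made-up arithmetic, standing in for the learned network:

```python
def rnn_step(state, token):
    # Placeholder for the learned transition function of the RNN cell.
    return 0.5 * state + token

def rnn_encode(tokens, initial_state=0.0):
    state = initial_state
    for token in tokens:                 # one token at a time
        state = rnn_step(state, token)   # new state feeds the next step
    return state                         # final state represents the whole input

print(rnn_encode([1, 2, 3]))
```

Notice that the final state depends on every token, which is exactly what lets it serve as the vector representation passed to the decoder.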
- At each step, the decoder generates only the probability that each token in your vocabulary is the next one.
- Using these probabilities, you have to select a word, and there are several approaches for that.
- The simplest one, called greedy search, is to generate the token that has the highest probability.
- A better approach that produces better results is called beam search.
- In that case, you use the probabilities generated by the decoder to evaluate the probability of sentence chunks rather than individual words, keeping only the most likely chunks at each step.
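The contrast between the two decoding strategies can be shown with a hypothetical next-token model (the probabilities below are invented for illustration). Greedy search commits to the single best token at each step; beam search scores whole chunks, which here lets it find a sequence with higher overall probability:

```python
import math

def next_probs(prefix):
    """Hypothetical decoder output: {token: probability} for the next token."""
    if not prefix:
        return {"a": 0.6, "b": 0.4}
    if prefix[-1] == "a":
        return {"x": 0.3, "y": 0.3, "<eos>": 0.4}
    return {"x": 0.9, "<eos>": 0.1}

def greedy_decode(steps=2):
    seq = []
    for _ in range(steps):
        probs = next_probs(seq)
        seq.append(max(probs, key=probs.get))  # commit to the best token now
    return seq

def beam_decode(steps=2, beam_width=2):
    beams = [([], 0.0)]  # (sequence chunk, log-probability of the whole chunk)
    for _ in range(steps):
        candidates = []
        for seq, score in beams:
            for tok, p in next_probs(seq).items():
                candidates.append((seq + [tok], score + math.log(p)))
        # Keep only the beam_width most probable chunks.
        beams = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_width]
    return beams[0][0]

print(greedy_decode())  # locks in "a" early, ends up with probability 0.24
print(beam_decode())    # keeps "b" alive, finds a chunk with probability 0.36
```

Greedy search picks "a" first because it is individually most likely, but the best continuation of "b" yields a more probable sentence overall; beam search finds it by scoring chunks instead of single words.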