- Research Vision
A short essay summarizing my thoughts on the future direction of my research.
Read More
- ∞-former: Infinite Memory Transformer Summary
A Korean-language summary of the ∞-former: Infinite Memory Transformer paper.
Read More
- Memformer: A Memory-Augmented Transformer for Sequence Modeling Review
A summary and brief review of the Memformer: A Memory-Augmented Transformer for Sequence Modeling paper.
Read More
- Compressive Transformers for Long-Range Sequence Modelling Review
A summary and brief review of the Compressive Transformers for Long-Range Sequence Modelling paper.
Read More
- Sequential Recommendation with User Memory Networks Review
A summary and brief review of the Sequential Recommendation with User Memory Networks paper.
Read More
- Not all memories are created equal: Learning to forget by expiring Review
A summary and brief review of the Not all memories are created equal: Learning to forget by expiring paper.
Read More
- Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems Summary
A summary of the Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems paper.
Read More
- Decision Transformer: Reinforcement Learning via Sequence Modeling Summary
A summary of the Decision Transformer: Reinforcement Learning via Sequence Modeling paper.
Read More
- TorchServe Usage and Impressions
A brief guide to using TorchServe, along with overall impressions from working with it.
Read More
- Doing More with Less: Improving Robustness Using Generated Data Review
A review and summary of DeepMind's Doing More with Less: Improving Robustness Using Generated Data paper.
Read More
- Neural Turing Machines Review
A review and summary of DeepMind's Neural Turing Machines paper.
Read More
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension Review
A review and summary of Facebook's BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension paper.
Read More
- Conformer: Convolution-augmented Transformer for Speech Recognition Review
A review and summary of Google's Conformer: Convolution-augmented Transformer for Speech Recognition paper.
Read More
- Exploring Generalization in Deep Learning Review
A review and summary of the Exploring Generalization in Deep Learning paper.
Read More
- Sequence Transduction with Recurrent Neural Networks Review
A review and summary of the Sequence Transduction with Recurrent Neural Networks paper.
Read More
- Listen, Attend and Spell Review
A review and summary of the Listen, Attend and Spell paper.
Read More
- What Do Compressed Deep Neural Networks Forget? Review
A review and summary of Google's What Do Compressed Deep Neural Networks Forget? paper.
Read More
- Training with Quantization Noise for Extreme Model Compression Review
A review and summary of Facebook AI Research's Training with Quantization Noise for Extreme Model Compression paper.
Read More