Automatic Speech Recognition
Compression
Deep Learning
Develop
Essay
Generalization
Korean
Machine Learning
Memory
- ∞-former: Infinite Memory Transformer (summary)
- Memformer: A Memory-Augmented Transformer for Sequence Modeling (review)
- Compressive Transformers for Long-Range Sequence Modelling (review)
- Sequential Recommendation with User Memory Networks (review)
- Not All Memories are Created Equal: Learning to Forget by Expiring (review)
- Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems (summary)
Model Serving
NLP
- Retrospective on the 4th AI X Bookathon
- Not All Memories are Created Equal: Learning to Forget by Expiring (review)
- Mem2Seq: Effectively Incorporating Knowledge Bases into End-to-End Task-Oriented Dialog Systems (summary)
- Retrospective on training a Korean BART with huggingface
- Implementing a Seq2Seq model, training, and serving code with TensorFlow 2
- BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension (review)
- What Do Compressed Deep Neural Networks Forget? (review)
- Training with Quantization Noise for Extreme Model Compression (review)
- Developing a Korean word-spacing correction model (Quickspacer)