Exploring Sequence-to-Sequence (Seq2Seq) models for automated language translation.
Machine Translation has long been described as a "holy grail" of natural language processing. This project investigates how well neural networks translate text between [Language A] and [Language B], examining attention mechanisms, the limits of BLEU as an evaluation metric, and the handling of linguistic nuance.
- Preprocessing: Tokenization, Padding, and Embedding layers.
- Architecture: Encoder-Decoder LSTM/GRU network.
- Evaluation: Translation quality measured with standard BLEU metrics.
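The preprocessing step above can be sketched in a few lines. This is a minimal illustration of tokenization, integer encoding, and padding, not the repository's actual API; all names (`build_vocab`, `encode`, `pad`, `PAD`, `UNK`) are hypothetical.

```python
# Minimal preprocessing sketch: whitespace tokenization, integer
# encoding against a vocabulary, and fixed-length padding.
# All names here are illustrative, not the repo's actual API.

PAD, UNK = 0, 1  # reserved ids for padding and out-of-vocabulary tokens

def build_vocab(sentences):
    """Map each unique token to an integer id, reserving 0 and 1."""
    vocab = {}
    for sent in sentences:
        for tok in sent.lower().split():
            vocab.setdefault(tok, len(vocab) + 2)
    return vocab

def encode(sentence, vocab):
    """Convert a sentence to a list of token ids."""
    return [vocab.get(tok, UNK) for tok in sentence.lower().split()]

def pad(ids, max_len):
    """Right-pad (or truncate) a sequence to exactly max_len ids."""
    return (ids + [PAD] * max_len)[:max_len]

corpus = ["the cat sat", "the dog ran home"]
vocab = build_vocab(corpus)
batch = [pad(encode(s, vocab), 5) for s in corpus]
# Every row now has length 5, ready to feed an Embedding layer.
```

In the real pipeline the integer ids index into a trainable Embedding layer, and padding lets variable-length sentences be batched into rectangular tensors.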
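For the evaluation step, a rough sketch of sentence-level BLEU helps make the metric concrete: the geometric mean of modified n-gram precisions, scaled by a brevity penalty. This is a simplified single-reference version written for illustration; production evaluation would use a smoothed corpus-level implementation such as the one in NLTK or SacreBLEU.

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list, as tuples."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def bleu(candidate, reference, max_n=4):
    """Simplified sentence-level BLEU against one reference.

    Geometric mean of modified n-gram precisions (n = 1..max_n)
    times a brevity penalty; no smoothing, so any zero n-gram
    overlap collapses the score to 0.
    """
    precisions = []
    for n in range(1, max_n + 1):
        cand = Counter(ngrams(candidate, n))
        ref = Counter(ngrams(reference, n))
        overlap = sum((cand & ref).values())  # clipped n-gram matches
        if overlap == 0:
            return 0.0
        precisions.append(overlap / sum(cand.values()))
    # Brevity penalty: punish candidates shorter than the reference.
    if len(candidate) > len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / max(len(candidate), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / max_n)
```

A candidate identical to its reference scores 1.0, while a candidate sharing no unigrams scores 0.0; real translations fall in between, which is why BLEU is reported as a corpus-level average.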
This repository contains both the implementation code and a detailed essay tracing the evolution from Statistical Machine Translation (SMT) to Neural Machine Translation (NMT).