HiWi Positions - Machine Translation

The Lehrstuhl für Informatik 6 of RWTH Aachen University is looking for Bachelor/Master students (HiWi/WiHi) for research projects in machine translation.

Your tasks:
  • Implementation and testing of new techniques in machine translation
  • Data preparation, translation experiments
  • Translation quality analysis
Your benefits:
  • Getting involved in state-of-the-art research in machine translation
  • Cutting-edge computing environment
  • Chances to write a Bachelor/Master thesis
IMPORTANT: You must work as a HiWi/WiHi student for at least 3 months (on average 9 months) before you receive a thesis topic and register it. During this thesis preparation period, you will:
  • become familiar with essential software and tools
  • read and fully understand the relevant literature
  • produce initial experimental results
Of course, you can also work as HiWi/WiHi without pursuing a thesis.

Minimum qualifications:
  • Strong programming skills in Python and/or C++
  • Attendance of one of the lectures/seminars/lab courses at the Lehrstuhl für Informatik 6 (this semester or earlier)
Preferred qualifications:
  • Knowledge of Unix shells (e.g. Bash)

To apply, please write an e-mail to:

Yunsu Kim
kim [-at-] cs.rwth-aachen.de

including the following:

  • Short CV ("Lebenslauf")
  • Transcript of your grades ("Notenspiegel") from RWTH Aachen
    (if it is your first semester at RWTH, the grades from your previous university)
  • Your current status: degree program, semester
  • Description of any experience (courses, projects, industry work, etc.) related to our field (machine learning, natural language processing, machine translation)
  • Whether you would like to write a thesis at our chair, and if yes, when
We accept applications on a rolling basis throughout each semester.

Finished/Ongoing Theses

  • Generating Multilingual Sentence Embeddings Using Neural Machine Translation (Hendrik Rosendahl)
  • Unsupervised Learning of Neural Network Lexicon and Cross-lingual Word Embedding (Jiahui Geng)
  • Improving Neural Machine Translation Using Alignment-based Models (Munkhzul Erdenedash)
  • Training of Extended Count-Based Models for SMT (Patrick Wilken)
  • Analyzing Dual Learning in Neural Machine Translation (Pavel Petrushkov)
  • A Hybrid Approach To Neural Machine Translation Search With Phrases (Leonard Dahlmann)
  • Neural Network Based Hidden Markov Model for Statistical Machine Translation (Derui Zhu)
  • Simulation Experiments on Unsupervised Training for Handwriting Recognition (Taras-Svitozar Kucherenko)
  • Improving Alignment-based Neural Machine Translation (Mohammed Hethnawi)
  • Investigation on Neural Network Inputs and Structures for Statistical Machine Translation (Weiyue Wang)
  • Decoding with Recurrent Neural Networks for Phrase-based Statistical Machine Translation (Felix Rietig)
  • Phrase-table Smoothing with Word Classes (Yunsu Kim)

  • Extension of the Attention Mechanism in Neural Machine Translation (Christopher Jan-Steffen Brix)
  • Improving Alignment-based Neural Machine Translation (Gabriel Bretschner)
  • Alignment Methods for Attention-based Neural Machine Translation (Arne Nix)
  • Analysing Attention-based Neural Machine Translation (Hendrik Rosendahl)
  • Training Neural Networks on Parallel Data without Alignment (Nick Rossenbach)
  • Word-based Decoding with Joint Translation and Reordering Models (Miguel Graça)