Seminar "Selected Topics in Human Language Technology and Pattern Recognition"
In the winter term 2015/16, the Lehrstuhl für Informatik 6 will host a
seminar entitled "Selected Topics in Human Language Technology and Pattern
Recognition".
Prerequisites for participation in the seminar
- Bachelor students: Einführung in das wissenschaftliche Arbeiten (Proseminar)
- Master students: Bachelor degree
- Diploma students: Vordiplom
- Attendance of one of the lectures Pattern Recognition and Neural
Networks, Speech Recognition, or Statistical Methods in Natural Language
Processing, or evidence of equivalent knowledge.
- Successful participants of the above lectures are guaranteed the
possibility of a seminar talk.
Seminar format and important dates
The presentation blocks of the seminar "Selected Topics in Human Language
Technology and Pattern Recognition" will take place on the following dates:
- Tuesday, 12.01.2016 14:00-16:30 - Language Models and Word Classes
9. Language Model Pruning (Tran; Supervisor: Parnia Bahar)
11. Recurrent Neural Network-based Language Models (Jendrosch; Supervisor: Parnia Bahar)
21. Word Classes (Nguyen; Supervisor: Kazuki Irie)
- Tuesday, 19.01.2016 14:00-17:00 - Neural Networks and Image Captions
18. Convolutional Neural Networks (Esser; Supervisor: Patrick Doetsch)
8. Word Embeddings and Neural Networks (Jiang; Supervisor: Jan-Thorsten Peter)
19. Sequence generation with recurrent neural networks (Nix; Supervisor: Patrick Doetsch)
20. Generating Image Captions (Shafin; Supervisor: Harald Hanselmann)
- Tuesday, 26.01.2016 14:00-17:00 - Machine Translation
1. The IBM Translation Models (Wo; Supervisor: Weiyue Wang)
2. Decoding for Phrase-Based Statistical Machine Translation (Golovin; Supervisor: Weiyue Wang)
3. Machine Translation Evaluation (Linhart; Supervisor: Stephan Peitz)
4. Parameter Tuning for Statistical Machine Translation (Hegselmann; Supervisor: Jan-Thorsten Peter)
- Tuesday, 02.02.2016 14:00-17:00 - Machine Translation
5. Reordering Models for Statistical Machine Translation (Schulte; Supervisor: Yunsu Kim)
6. Factored Machine Translation (Feser; Supervisor: Stephan Peitz)
7. System Combination for Machine Translation (Bock; Supervisor: Weiyue Wang)
12. ADVANCED: Discriminative Training for Machine Translation (Drichel; Supervisor: Yunsu Kim)
- Tuesday, 09.02.2016 14:00-17:00 - Machine Translation
15. ADVANCED: Neural Machine Translation (Rosendahl; Supervisor: Jan-Thorsten Peter)
23. Deciphering Foreign Language (Neumann; Supervisor: Julian Schamper)
Further, please note the following deadlines:
- Proposals: initial proposals will be accepted up until the start of the
term (October 10, 2015) at the Lehrstuhl für Informatik 6 office or by the
relevant supervisor. At this time, participants must arrange an appointment
with their supervisor. Revised proposals will be accepted up until two weeks
after the start of the term.
- Article: must be submitted at least 1 month prior to the trial presentation date
to either the Lehrstuhl für Informatik 6 office or the relevant
supervisor.
- Presentation slides: must be submitted at least 1 week prior to the trial presentation date
to either the Lehrstuhl für Informatik 6 office or the relevant
supervisor.
- Trial presentations: at least 2 weeks prior to the
actual presentation date; refer to the section on topics.
- Seminar presentations: the exact dates and the schedule for the
presentation blocks (expected around January 2016) are arranged and
announced for the individual topics; see the schedule above.
- Final (possibly corrected) articles and presentation slides:
must be submitted at the latest 4
weeks after the presentation date to either the Lehrstuhl für Informatik 6 office or the relevant supervisor.
- Compulsory attendance: in order to receive a
certificate, participants must attend all presentation sessions.
- Ethical Guidelines: The Computer Science Department of RWTH Aachen
University has adopted ethical guidelines for the authoring of academic
work such as seminar reports. Each student has to comply with these
guidelines. In this regard, you, as a seminar participant, have to sign a
declaration of compliance, in which you assert that your work complies with
the guidelines, that all references used are properly cited, and that you
wrote the report independently. We ask you to download the guidelines and
to submit the declaration together with your seminar report and talk to
your supervisor. A German version of the guidelines and a German version of
the declaration are also available and may be used as well.
Note: failure to meet deadlines, unexcused absence from compulsory sessions
(presentations and the preliminary meeting, as announced by email to each
participating student), or dropping out of the seminar later than 3 weeks
after the preliminary meeting/topic distribution results in the grade
5.0 (not appeared).
Topics, relevant references and participants
Specific topics will be introduced at a preparatory meeting
in the seminar room at the Lehrstuhl für Informatik 6.
In general, selected topics from the following general areas of Human
Language Technology and Pattern Recognition will be offered:
- Automatic Speech Recognition;
- Machine Translation;
- Pattern Recognition.
Some possible topics, supervisors, and basic references:
- The IBM Translation Models (Wo; Supervisor: Weiyue Wang)
References:
- Chapter 4 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- Peter F. Brown, Stephen A. Della Pietra, Vincent J. Della Pietra, and Robert L. Mercer: "The mathematics
of statistical machine translation: Parameter estimation," Computational Linguistics, 19(2):263-311, 1993.
- Franz Josef Och and Hermann Ney: "Improved statistical alignment models," Proceedings of the 38th
Annual Meeting of the Association for Computational Linguistics, 2000.
- Decoding for Phrase-Based Statistical Machine Translation (Golovin; Supervisor: Weiyue Wang)
References:
- Chapter 6 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- P. Koehn: "Pharaoh: a Beam Search Decoder for Phrase-Based Statistical Machine Translation Models,"
Proc. 6th Conference of the Association for Machine Translation in the Americas (AMTA),
Washington, DC, September 2004.
- R. C. Moore, C. Quirk: "Faster Beam-Search Decoding
for Phrasal Statistical Machine Translation,"
Proc. MT Summit XI, Copenhagen, Denmark, September 2007.
- R. Zens, H. Ney: "Improvements in Dynamic Programming Beam Search for Phrase-based Statistical Machine Translation,"
International Workshop on Spoken Language Translation (IWSLT),
Honolulu, Hawaii, October 2008.
- Machine Translation Evaluation (Linhart; Supervisor: Stephan Peitz)
References:
- Chapter 8 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- K. Papineni, S. Roukos, T. Ward, and W. Zhu: "BLEU: a Method for Automatic Evaluation of Machine Translation," Proc. ACL, pp. 311-318, Philadelphia, PA, July 2002.
- Matthew Snover, Bonnie Dorr, Richard Schwartz, Linnea Micciulla, John Makhoul: "A Study of Translation Edit Rate with Targeted Human Annotation," In Proceedings AMTA, August 2006, pp. 223-231.
- A. Lavie, A. Agarwal: "METEOR: An Automatic Metric for MT Evaluation with High Levels of Correlation with Human Judgments," Proc. 2nd Workshop on MT, ACL, pp. 228-231, Prague, Czech Republic, June 2007.
- Parameter Tuning for Statistical Machine Translation (Hegselmann; Supervisor: Jan-Thorsten Peter)
References:
- Chapter 9 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- J. A. Nelder, R. Mead: "A Simplex Method for Function Minimization," Computer Journal, 7(4):308-313, 1965.
- F. J. Och: "Minimum Error Rate Training for Statistical Machine Translation,"
Proc. of the 41st Annual Meeting of the Association for Computational Linguistics (ACL),
pp. 160-167, Sapporo, Japan, July 2003.
- D. Chiang, Y. Marton, P. Resnik: "Online large-margin
training of syntactic and structural translation features,"
Proc. Empirical Methods in Natural Language Processing (EMNLP),
pp. 224-233, Honolulu, Hawaii, October 2008.
- D. Chiang, K. Knight, and W. Wang:
"11,001 New Features for Statistical Machine Translation,"
Proc. NAACL HLT, pp. 218-226, Boulder, Colorado, May 2009.
- Reordering Models for Statistical Machine Translation (Schulte; Supervisor: Yunsu Kim)
References:
- Chapter 5.4 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- C. Tillmann: "A Unigram Orientation Model for Statistical Machine Translation," Proceedings of HLT-NAACL 2004: Short Papers, pp. 101-104, Boston, MA, May 2004.
- M. Galley and C. Manning: "A Simple and Effective Hierarchical Phrase Reordering Model," Proceedings of the Conference on Empirical Methods in Natural Language Processing, pp. 848-856, Honolulu, Hawaii, October 2008.
- Factored Machine Translation (Feser; Supervisor: Stephan Peitz)
References:
- Chapter 10 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- P. Koehn, H. Hoang: "Factored Translation Models," EMNLP, 2007.
- O. Bojar: "English-to-Czech Factored Machine Translation,"
Proceedings of the Second Workshop on Statistical Machine Translation,
pp. 232-239, Prague, Czech Republic, 2007.
- A. Birch, M. Osborne, and P. Koehn: "CCG Supertags in Factored Statistical Machine Translation,"
ACL Workshop on Statistical Machine Translation, 2007.
- E. Avramidis, P. Koehn: "Enriching Morphologically Poor Languages for Statistical Machine Translation,"
ACL, 2008.
- P. Koehn, M. Federico, W. Shen, N. Bertoldi, O. Bojar, C. Callison-Burch, B. Cowan, C. Dyer, H. Hoang, R. Zens, A. Constantin, C. Moran, and E. Herbst:
"Open Source Toolkit for Statistical Machine Translation: Factored Translation Models and Confusion Network Decoding,"
Technical report, Johns Hopkins University, Center for Speech and Language Processing,
August 2006.
- System Combination for Machine Translation (Bock; Supervisor: Weiyue Wang)
References:
- E. Matusov et al.: "System Combination for Machine Translation of Spoken and Written Language", IEEE Transactions on Audio, Speech and Language Processing, Vol. 16(7), pp. 1222-1237, Sep. 2008.
- A. S. Hildebrand and S. Vogel: "Combination of Machine Translation Systems via Hypothesis Selection from Combined N-Best Lists," Proc. AMTA, Hawaii, USA, Oct. 2008.
- A. V. Rosti et al.: "Combining Outputs from Multiple Machine Translation Systems," Proc. NAACL, Rochester, NY, USA, Apr. 2007.
- Word Embeddings and Neural Networks (Jiang; Supervisor: Jan-Thorsten Peter)
References:
- Tomas Mikolov, Kai Chen, Greg Corrado, and Jeffrey Dean. "Efficient Estimation of Word Representations in Vector Space". In Proceedings of Workshop at ICLR, 2013.
- Radu Soricut and Franz Och. "Unsupervised Morphology Induction Using Word Embeddings". In Proceedings of Workshop at NAACL, 2015.
- Sebastien Jean, Kyunghyun Cho, Roland Memisevic, and Yoshua Bengio. "On Using Very Large Target Vocabulary for Neural Machine Translation". CoRR, 2015.
- Minh-Thang Luong, Ilya Sutskever, Quoc V. Le, Oriol Vinyals, and Wojciech Zaremba. "Addressing the Rare Word Problem in Neural Machine Translation". ACL 2015
- Language Model Pruning (Tran; Supervisor: Parnia Bahar)
References:
- K. Seymore, R. Rosenfeld: "Scalable Backoff Language Models", Proc. Fourth International Conference on Spoken Language Processing (ICSLP), Vol. 1, pp. 232-235, Philadelphia, PA, 1996.
- R. Kneser: "Statistical Language Modeling Using a Variable Context Length", Proc. Fourth International Conference on Spoken Language Processing (ICSLP), Vol. 1, pp. 494-497, Philadelphia, PA, 1996.
- A. Stolcke, "Entropy Based Pruning Of Backoff Language Models", Proceedings DARPA Broadcast News Transcription and Understanding Workshop, pp. 270-274, Lansdowne, VA, 1998.
- V. Siivola, T. Hirsimäki, S. Virpioja: "On Growing and Pruning Kneser-Ney Smoothed N-gram Models", IEEE Transactions on Audio, Speech and Language Processing, Vol. 15, No. 5, July 2007.
- C. Chelba, T. Brants, W. Neveitt, P. Xu: "Study on Interaction between Entropy Pruning and Kneser-Ney Smoothing", Proceedings of Interspeech 2010, pp. 2422-2425, Makuhari, Japan, September 2010.
- Feed-Forward Neural Network-based Language Models (Supervisor: Parnia Bahar)
References:
- Y. Bengio, R. Ducharme, P. Vincent, C. Jauvin: "A Neural Probabilistic Language Model", Journal of Machine Learning Research, 3:1137-1155, 2003.
- H. Schwenk, J.-L. Gauvain: "Training Neural Network Language Models on Very Large Corpora", Proc. Joint Conference HLT/EMNLP, pp. 201-208, 2005.
- H. Schwenk: "Continuous Space Language Models," in Computer Speech & Language, Vol. 21, No. 3, pp. 492-518, 2007.
- M. Sundermeyer, I. Oparin, J. Gauvain, B. Freiberg, R. Schlüter, and H. Ney. Comparison of Feedforward and Recurrent Neural Network Language Models. In IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pages 8430-8434, Vancouver, Canada, May 2013.
- Recurrent Neural Network-based Language Models (Jendrosch; Supervisor: Parnia Bahar)
References:
- Y. Bengio, R. Ducharme, P. Vincent, C. Jauvin: "A Neural Probabilistic Language Model", Journal of Machine Learning Research, 3:1137-1155, 2003.
- H. Schwenk, J.-L. Gauvain: "Training Neural Network Language Models on Very Large Corpora", Proc. Joint Conference HLT/EMNLP, pp. 201-208, 2005.
- T. Mikolov, M. Karafiát, L. Burget, J. Cernocký, S. Khudanpur: "Recurrent Neural Network based Language Model", Proc. Interspeech 2010, pp. 1045-1048, Makuhari, Chiba, Japan, Sept. 2010.
- T. Mikolov, S. Kombrink, L. Burget, J. Cernocký, S. Khudanpur: "Extensions of Recurrent Neural Network Language Model", Proc. IEEE Intern. Conf. on Acoustics, Speech and Signal Processing (ICASSP), pp. 5528-5531, Prague, Czech Republic, May 2011.
- M. Sundermeyer, H. Ney, and R. Schlüter. From Feedforward to Recurrent LSTM Neural Networks for Language Modeling. IEEE/ACM Transactions on Audio, Speech, and Language Processing, volume 23, number 3, pages 517-529, March 2015.
- M. Sundermeyer, I. Oparin, J. Gauvain, B. Freiberg, R. Schlüter, and H. Ney. Comparison of Feedforward and Recurrent Neural Network Language Models. IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP), pages 8430-8434, Vancouver, Canada, May 2013.
- ADVANCED: Discriminative Training for Machine Translation (Drichel; Supervisor: Yunsu Kim)
References:
- Chapter 9 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- Abraham Ittycheriah and Salim Roukos:
"Direct Translation Model 2,"
Proc. Human Language Technologies: The Annual Conference of the North American
Chapter of the Association for Computational Linguistics (NAACL-HLT),
pp. 57-64, Rochester, NY, April 2007.
- Percy Liang, Alexandre Bouchard-Côté, Dan Klein, and Ben Taskar:
"An End-to-End Discriminative Approach to Machine Translation,"
Proc. Joint Conf. of the Int. Committee on Computational Linguistics
and the Association for Computational Linguistics (Coling-ACL),
pp. 761-768, Sydney, Australia, July 2006.
- D. Chiang, Y. Marton, P. Resnik: "Online large-margin
training of syntactic and structural translation features,"
in Proc. Empirical Methods in Natural Language Processing
(EMNLP), pp. 224-233, Honolulu, Hawaii, October 2008.
- ADVANCED: Discriminative Alignment for Machine Translation (Supervisor: Yunsu Kim)
References:
- Ben Taskar, Simon Lacoste-Julien, and Dan Klein:
"A Discriminative Matching Approach to Word Alignment,"
Proc. Human Language Technology Conference
and Conference on Empirical Methods in Natural Language Processing (HLT/EMNLP),
pp. 73-80, Vancouver, Canada, October 2005.
- Abraham Ittycheriah and Salim Roukos:
"A Maximum Entropy Word Aligner for Arabic-English Machine Translation,"
Proc. Human Language Technology Conference
and Conference on Empirical Methods in Natural Language Processing (HLT/EMNLP),
pp. 89-96, Vancouver, BC, Canada, October 2005.
- Phil Blunsom and Trevor Cohn:
"Discriminative Word Alignment with Conditional Random Fields,"
Proc. 21st International Conference on Computational Linguistics
and 44th Annual Meeting of the Association for Computational Linguistics,
pp. 65-72, Sydney, Australia, July 2006.
- ADVANCED: N-gram-based Machine Translation (Supervisor: Andreas Guta)
References:
- J. Mariño, R. Banchs, J. Crego, A. de Gispert, P. Lambert, J. Fonollosa, and M. R. Costa-jussà:
"N-gram-based Machine Translation",
Computational Linguistics, Vol. 32(4), pp. 527-549, December 2006.
- Josep Maria Crego and François Yvon:
"Improving reordering with linguistically informed bilingual n-grams",
Proceedings of the 23rd International Conference on Computational Linguistics
(Coling 2010: Posters), pp. 197-205, Beijing, China, August 2010.
- ADVANCED: Neural Machine Translation (Rosendahl; Supervisor: Jan-Thorsten Peter)
References:
- Ilya Sutskever, Oriol Vinyals, and Quoc V. Le
"Sequence to Sequence Learning with Neural Networks",
NIPS 2014
- Dzmitry Bahdanau, Kyunghyun Cho, Yoshua Bengio
"Neural Machine Translation by Jointly Learning to Align and Translate",
ICLR 2015
- Sébastien Jean, Kyunghyun Cho, Roland Memisevic, Yoshua Bengio
"On Using Very Large Target Vocabulary for Neural Machine Translation",
ACL 2015
- Daxiang Dong, Hua Wu, Wei He, Dianhai Yu, Haifeng Wang
"Multi-Task Learning for Multiple Language Translation",
ACL 2015
- ADVANCED: Operation Sequence Model (Supervisor: Andreas Guta)
References:
- Nadir Durrani, Helmut Schmid, and Alexander Fraser:
"A joint sequence translation model with integrated reordering".
In Proceedings of the 49th Annual Meeting of the Association for Computational Linguistics:
Human Language Technologies, pages 1045-1054, Portland, Oregon, USA, June 2011.
- N. Durrani, A. Fraser, H. Schmid, and H. Hoang:
"Can Markov Models Over Minimal Translation Units Help Phrase-Based SMT?,"
in Proc. ACL, Sofia, Bulgaria, August 2013.
- N. Durrani, A. Fraser, and H. Schmid:
"Model With Minimal Translation Units, But Decode With Phrases,"
in Proc. NAACL, Atlanta, Georgia, USA, June 2013.
- ADVANCED: Syntax-Oriented Hierarchical Translation (Supervisor: Stephan Peitz)
References:
- Chapter 11 of P. Koehn: "Statistical Machine Translation," textbook, Cambridge University Press, January 2010.
- Ashish Venugopal, Andreas Zollmann, Noah A. Smith, and Stephan Vogel: "Preference Grammars: Softening Syntactic Constraints to
Improve Statistical Machine Translation," Proceedings of Human Language Technologies: The 2009 Annual Conference of the North American
Chapter of the Association for Computational Linguistics, Boulder, Colorado, June 2009.
- Andreas Zollmann and Stephan Vogel: "A Word-Class Approach to Labeling PSCFG Rules for Machine Translation," Proc. of ACL,
Portland, Oregon, 2011.
- Bing Zhao and Yaser Al-Onaizan: "Generalizing Local and Non-Local Word-Reordering Patterns for Syntax-Based Machine Translation,"
EMNLP, 2008.
- Convolutional Neural Networks (Esser; Supervisor: Patrick Doetsch)
References:
- LeCun, Yann, and Yoshua Bengio. "Convolutional networks for images, speech, and time series", The Handbook of Brain Theory and Neural Networks, 1995.
- LeCun, Yann, and Yoshua Bengio. "Word-level training of a handwritten word recognizer based on convolutional neural networks", Proc. International Conference on Pattern Recognition, IEEE Computer Society Press, 1994.
- Jaderberg, Max, et al. "Spatial Transformer Networks", arXiv preprint arXiv:1506.02025 (2015).
- Sequences and Recurrent Neural Networks (Nix; Supervisor: Patrick Doetsch)
References:
- Graves, Alex. "Generating sequences with recurrent neural networks", arXiv preprint arXiv:1308.0850 (2013)
- Cho, Kyunghyun, et al. "Learning phrase representations using RNN encoder-decoder for statistical machine translation", arXiv preprint arXiv:1406.1078 (2014)
- Vinyals, Oriol, and Quoc Le. "A Neural Conversational Model.", arXiv preprint arXiv:1506.05869 (2015).
- Generating Image Captions (Shafin; Supervisor: Harald Hanselmann)
References:
- Fang, Hao, et al. "From captions to visual concepts and back", arXiv preprint arXiv:1411.4952 (2014).
- Karpathy, Andrej, and Li Fei-Fei. "Deep visual-semantic alignments for generating image descriptions", arXiv preprint arXiv:1412.2306 (2014).
- Xu, Kelvin, et al. "Show, attend and tell: Neural image caption generation with visual attention", arXiv preprint arXiv:1502.03044 (2015).
- Word Classes (Nguyen; Supervisor: Kazuki Irie)
References:
- P. F. Brown, P. V. deSouza, R. L. Mercer, V. J. D. Pietra, J. C. Lai: "Class-based n-gram models of natural language", Computational Linguistics, Vol. 18, No. 4, pp. 467-479, 1992.
- R. Kneser, H. Ney: "Forming word classes by statistical clustering for statistical language modelling", In Contributions to Quantitative Linguistics, pp. 221-226, Springer, 1991.
- S. Martin, J. Liermann, H. Ney: "Algorithms for bigram and trigram word clustering", Speech Communication, Vol. 24, No. 1, pp. 19-37, 1998.
- T. Mikolov, K. Chen, G. Corrado, J. Dean: "Efficient estimation of word representations in vector space", arXiv preprint arXiv:1301.3781, 2013.
- Word Class Based Language Modelling (Supervisor: Yunsu Kim)
References:
- H. Ney, U. Essen, R. Kneser: "On structuring probabilistic dependences in stochastic language modelling", Computer Speech & Language, Vol. 8, No. 1, pp. 1–38, 1994.
- J. Gao, J. Goodman, J. Miao: "The Use of Clustering Techniques for Language Modeling - Application to Asian Languages", Computational Linguistics and Chinese Language Processing, Vol. 6, No. 1, February 2001.
- J. Uszkoreit, T. Brants: "Distributed Word Clustering for Large Scale Class-Based Language Modeling in Machine Translation", In 46th Annual Meeting of the Association for Computational Linguistics: Human Language Technologies (ACL-08: HLT), pp. 755-762, Columbus, OH, USA, June 2008.
- A. Bisazza, C. Monz: "Class-based language modeling for translating into morphologically rich languages", In 25th International Conference on Computational Linguistics (COLING 2014), pp. 1918-1927, Dublin, Ireland, August 2014.
- Deciphering Foreign Language (Neumann; Supervisor: Julian Schamper)
References:
- C. E. Shannon: "Communication Theory of Secrecy Systems," Bell System Technical Journal, Vol. 28, No. 4, pp. 656-715, 1949. Available at: http://netlab.cs.ucla.edu/wiki/files/shannon1949.pdf.
- K. Knight, A. Nair, N. Rathod, K. Yamada: "Unsupervised analysis for decipherment problems," Proceedings of the COLING/ACL 2006 Main Conference Poster Sessions, pp. 499-506, 2006.
- K. Knight: "Attacking decipherment problems optimally with low-order n-gram models," EMNLP 2008 Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing,
Honolulu, Hawaii, USA, October 2008.
- S. Ravi, K. Knight: "Deciphering Foreign Language," Proc. 49th Annual Meeting of the Association for Computational Linguistics (ACL), Portland, Oregon, June 2011.
Guidelines for the article and presentation
The roughly 20-page article and the presentation slides (between 20 and 30
slides) should be prepared in LaTeX. Presentations will consist of 30 to 45
minutes of presentation time and 15 minutes of discussion time. Document
templates for both the article and the presentation slides are provided
below, along with links to LaTeX documentation available online. Both the
article and the slides must be submitted electronically in PDF format;
other formats will not be accepted.
- Online LaTeX-Documentation:
- Guidelines for articles and presentation slides:
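To give an idea of what is expected, a minimal article skeleton in LaTeX might look as follows. This is an illustrative sketch only; the file name, document class, and package choices are assumptions, and the official template provided above remains authoritative.

    % article.tex - illustrative sketch only; the official template is authoritative
    \documentclass[11pt,a4paper]{article}
    \usepackage[utf8]{inputenc}   % allow umlauts etc. in the source file
    \usepackage{amsmath}          % consistent mathematical notation
    \usepackage{graphicx}         % figures (captions appear below figures)

    \title{Your Seminar Topic}
    \author{Your Name \\ Supervisor: Your Supervisor}

    \begin{document}
    \maketitle

    \section{Introduction}
    % Outline the most important literature used (see the guidelines below).

    % ... further sections ...

    \bibliographystyle{plain}
    \bibliography{references}     % references.bib holds all cited literature

    \end{document}

Such a source file can be compiled to the required PDF with pdflatex, running bibtex in between so that the references are resolved.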
General:
- The aim of the seminar is for participants to learn the following:
- to tackle a topic and expand their knowledge of it
- to critically analyze the literature
- to give a presentation
- Take notice of references
to other topics in the seminar and discuss topics with one
another!
- Take care to stay within your
own topic. To this end participants should be aware of the other
topics in the seminar. If applicable, cross-reference
other articles and presentations.
Specific:
- Important: As part of the introduction, a slide should
outline the most important literature used for the presentation. In
addition, the presentation should clearly indicate which literature the
individual parts of the presentation are based on.
- Participants are expected to seek out additional literature on their
topic. Assistance with the literature search is available at the
faculty's library. Access to literature is naturally also available at
the Lehrstuhl für Informatik 6 library.
- Notation/Mathematical Formulas: consistent, correct notation
is essential. Where necessary, differing notation from different
literature sources must be unified so that it is clear and consistent. The
lectures held by the Lehrstuhl für Informatik 6 provide a
guide as to what appropriate notation should look like.
- Tables
must have titles (appearing above the table).
- Figures
must have captions (appearing below the figure).
- In the case that no adequate translation of an
English technical term is available, the term should be used unchanged.
- Articles and presentation slides can also be prepared in
English.
- Completeness:
acknowledge all literature and
sources.
- Referencing must conform to the standard
described in the article template; an illustrative BibTeX entry is
sketched after this list.
- Examples should be used to illustrate points.
- Examples should be as complex as necessary but as simple
as possible.
- Slides should be used
as presentation aids and not to replace the role of the presenter;
specifically, slides should:
- illustrate important points and relationships;
- remind the audience (and the presenter) of important aspects
and considerations;
- give the audience an overview
of the presentation.
- Slides should not contain chunks of text or complicated
sentences; rather they should consist of succinct words and terms.
- Use illustrations
where appropriate - a picture is worth a thousand words!
- Abbreviations should be defined at the first usage in the manner
demonstrated in the following example: "[...] at the
Rheinisch-Westfälische Technische Hochschule (RWTH) there are
[...]".
- Usage of fonts, typefaces and colors in presentation slides must
be consistent and appropriate. Such means should serve to clarify
points or relationships, not be applied needlessly or at random.
- Care should be taken when selecting fonts for presentation
slides (also within diagrams) to ensure legibility on a projector even
for those seated far from the screen.
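As an illustration of the referencing guideline above, and assuming for the sake of the example that the template uses BibTeX (the template itself defines the binding standard), an entry for one of the references listed on this page might look like this:

    % references.bib - hypothetical entry; the required fields are defined by the template
    @InProceedings{och2003mert,
      author    = {Franz Josef Och},
      title     = {Minimum Error Rate Training for Statistical Machine Translation},
      booktitle = {Proc. of the 41st Annual Meeting of the Association
                   for Computational Linguistics (ACL)},
      pages     = {160--167},
      address   = {Sapporo, Japan},
      year      = {2003}
    }

In the article source, this entry would then be cited with \cite{och2003mert}.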
Contact
Inquiries should be directed to the respective supervisors or to:
Julian Schamper
RWTH Aachen
Lehrstuhl für Informatik 6
Ahornstr. 55
52074 Aachen
Room 6129
Phone: 0241 / 80-21615
E-Mail: schamper@cs.rwth-aachen.de