Alongside my PhD, I am also working part-time on Named Entity Recognition and Relation Extraction from clinical text under the supervision of Beatrice Alex. Here is a demo of our Named Entity Recognition and Negation Detection models for identifying findings (e.g. stroke and tumours) in radiology reports.
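To give a flavour of what negation detection means here, below is a minimal NegEx-style rule sketch: a finding mention counts as negated if a negation cue appears within a small window of preceding tokens. The cue list and window size are illustrative assumptions; the models in the demo above are learned, not rule-based.

```python
# Toy NegEx-style negation detection: a finding is negated if a cue word
# appears within WINDOW tokens before it. Cues and window are illustrative.
NEGATION_CUES = {"no", "without", "denies"}
WINDOW = 4  # how many preceding tokens to scan for a cue

def is_negated(tokens, finding_index):
    # Look at the tokens immediately before the finding mention.
    start = max(0, finding_index - WINDOW)
    window = [t.lower() for t in tokens[start:finding_index]]
    return any(cue in window for cue in NEGATION_CUES)
```

For example, in "there is no evidence of stroke" the mention "stroke" is flagged as negated, while "large tumour in the left lobe" is not.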

In 2017 I completed my Master's degree at the University of Edinburgh, where I specialised in applied Machine Learning and Natural Language Processing.

More specifically, I learned a lot about the dark art of training neural networks and became familiar with the language modelling and dependency parsing literature.

My dissertation, supervised by Adam Lopez and Clara Vania, analysed the syntactic structure of sentences using neural network models for languages with rich morphology (demo, code, write-up). The results confirmed that for such languages, encoders which construct word representations from characters significantly outperform those that model the input at the word level.
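The core idea of building word representations from characters can be sketched with a fastText-style subword scheme: a word's vector is the average of vectors for its character n-grams, so morphologically related forms (shared stems and affixes) share components. This is an illustration of the general idea only, not the neural encoders used in the dissertation; the hashing and dimensions are toy assumptions.

```python
# Toy character-level word representation: average hashed character n-gram
# vectors (fastText-style). Illustrative only; not the dissertation's model.
import hashlib

DIM = 8  # toy embedding dimension

def ngram_vector(ngram):
    # Deterministically derive a toy dense vector from the n-gram's hash.
    digest = hashlib.md5(ngram.encode("utf-8")).digest()
    return [(b - 128) / 128.0 for b in digest[:DIM]]

def word_vector(word, n=3):
    # Boundary markers let n-grams distinguish prefixes and suffixes.
    padded = f"<{word}>"
    grams = [padded[i:i + n] for i in range(len(padded) - n + 1)]
    vecs = [ngram_vector(g) for g in grams]
    return [sum(v[d] for v in vecs) / len(vecs) for d in range(DIM)]
```

Because related inflections like "walk" and "walking" share many n-grams, their vectors share many of the same averaged components, which is exactly what helps in morphologically rich languages.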

Stack trace


After my M.Sc. I was an R&D Data Scientist at Mudano, where I prototyped NLP and ML algorithms under the supervision of Euan Wielewski.


Before my M.Sc. I was a research assistant at NCSR Demokritos under the supervision of Natasa Krithara and George Paliouras. I worked on text classification and on extracting relations and entity mentions from streams of text. Our submission placed third in the PAN 2015 Author Profiling task.


Before that I worked on Named Entity Recognition for Greek and Serbian under the supervision of Iraklis Varlamis.


I graduated from Harokopio University of Athens in 2014. For my B.Sc. dissertation, supervised by Rania Hatzi, Mara Nikolaidou and Dimosthenis Anagnostopoulos, I constructed a word similarity measure from synonym graphs with the aim of aligning sentences to recognise textual entailment. The word representations I constructed were influenced by the lectures and by my reading of the book Gödel, Escher, Bach.
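One simple way to turn a synonym graph into a word similarity measure is to let similarity decay with shortest-path distance between words: synonyms are one edge apart, near-synonyms a few edges, and unconnected words get zero. The graph, decay function, and example pairs below are illustrative assumptions, not the dissertation's exact formulation.

```python
# Toy synonym-graph word similarity: words are nodes, synonymy edges connect
# them, and similarity = 1 / (1 + shortest-path distance). Illustrative only.
from collections import deque

def build_graph(synonym_pairs):
    # Undirected adjacency sets from (word, word) synonym pairs.
    graph = {}
    for a, b in synonym_pairs:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    return graph

def shortest_path(graph, source, target):
    # Breadth-first search; returns None if the words are not connected.
    if source == target:
        return 0
    seen, queue = {source}, deque([(source, 0)])
    while queue:
        node, dist = queue.popleft()
        for neighbour in graph.get(node, ()):
            if neighbour == target:
                return dist + 1
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    return None

def similarity(graph, a, b):
    dist = shortest_path(graph, a, b)
    return 0.0 if dist is None else 1.0 / (1.0 + dist)
```

Scores like these can then feed a sentence-alignment step: aligned word pairs with high similarity are evidence that one sentence entails the other.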


I did my internship in 2013 at Scify under the supervision of George Giannakopoulos, who got me interested in Natural Language Processing.

More text/code with low perplexity under my language model