ILLC Code on GitHub
This is a list of GitHub repositories created or maintained by current or former members of the Institute for Logic, Language and Computation (ILLC). Current versions of the repositories are also available here.
1. Epistemology & Philosophy of Science (EPS)
- Concepts in Motion
- Code for Concepts in Motion and Vici-project Ideas at Scale (url)
- A SWI-Prolog interface to Hypothesis (url)
- Official code for the paper Een corpus waar alle constructies in gevonden zouden moeten kunnen worden? (‘A corpus in which all constructions should be findable?’) (url)
- Distributional Semantics for Neo-Latin (url)
- QuiNE-GT 1.0 (url)
- Evaluating the consistency of word embeddings from small data (url)
2. Formal Semantics & Philosophical Logic (FSPL)
- Official code for the paper The evolution of denial (British Journal for the Philosophy of Science) (url)
- SignLab (url)
3. Language & Music Cognition (LMC)
- Logic, Language, Computation, and Cognition Group
- Official code for the paper Learnability and Semantic Universals (url)
- Official code for the paper Ease of Learning Explains Semantic Universals (url)
- Official code for the paper An Explanation of the Veridical Uniformity Universal (url)
- Official code for the paper Neural Models of the Psychosemantics of ‘Most’ (url)
- Simplicity vs. Informativeness trade-off for quantifiers (url)
- Signaling games with function words amidst contextual variability (url)
- The emergence of monotone quantifiers via iterated learning (url)
- Official code for the paper Complexity/informativeness trade-off in the domain of indefinite pronouns (url)
- Music Cognition Group
- Contrastive Learning of Musical Representations (url)
- Jackdaw: a Common Lisp framework for defining discrete dynamic Bayesian networks with deterministic constraints while writing very little code (url)
- This package contains support functions for Computational Musicology at the University of Amsterdam (url)
4. Mathematical & Computational Logic (MCL)
- Leapfrog: certified equivalence for protocol parsers (Coq formalization) (url)
5. Natural Language Processing & Digital Humanities (NLP&DH)
- Dialogue Modelling Group (url)
- I-Machine-Think website (url)
- diagNNose: this library contains a set of modules that can be used to analyse the activations of neural networks (url); a generic activation-extraction sketch is given below
- This repository contains a two-stage grammar induction setup for analysing languages emerging in referential and other games (url)
- This repository contains data and scripts to use the tests from the compositionality evaluation paradigm described in the paper (url)
- Official code for the paper Single Headed Attention RNN: Stop Thinking With Your Head (url)
- Generalised Contextual Decomposition for Language Models (url)
- Official implementation of the Seq2Attn architecture for sequence-to-sequence tasks (url)
- This is a PyTorch implementation of a sequence-to-sequence learning toolkit for the i-machine-think project (url)
- Alpha version of the sygnal project (url)
- matrics - Machine Metrics: A library of common NLP / compositionality metrics (url)
- Assessing Incrementality in sequence-to-sequence models (url)
- A list of resources dedicated to compositionality (url)
- Datasets for compositional learning (url)
- Attentive Guidance (url)
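As a rough illustration of what activation analysis involves (see the diagNNose entry above), the sketch below uses a plain PyTorch forward hook to capture the hidden states of a toy LSTM. The model, layer sizes, and hook are illustrative assumptions only; they are not part of the diagNNose API.

```python
# Minimal sketch: capturing intermediate activations with a PyTorch forward
# hook. The toy model and sizes are illustrative assumptions; this is not
# the diagNNose API.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Embedding(100, 16),             # toy vocabulary of 100 tokens
    nn.LSTM(16, 32, batch_first=True)  # single-layer LSTM
)

activations = {}

def save_activation(name):
    def hook(module, inputs, output):
        # nn.LSTM returns (output, (h_n, c_n)); keep the per-step hidden states.
        out = output[0] if isinstance(output, tuple) else output
        activations[name] = out.detach()
    return hook

model[1].register_forward_hook(save_activation("lstm"))

tokens = torch.randint(0, 100, (1, 5))  # one dummy 5-token sentence
model(tokens)
print(activations["lstm"].shape)        # torch.Size([1, 5, 32])
```

A dedicated analysis library would add bookkeeping on top of this pattern, such as layer selection, batching over a corpus, and storage of the extracted activations.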
- Statistical language processing and learning lab (url)
- Official code for the paper All Fragments Count in Parser Evaluation (url)
- Auto-Encoding Variational Neural Machine Translation (PyTorch) (url)
- BEER 2.0 (url)
- Discontinuous DOP (url)
- Effective Estimation of Deep Generative Language Models (url)
- Extensions to torch distributions (url)
- Grasp – Randomised Semiring Parsing (url)
- Implementation of Deep Generative Model for Joint Alignment and Word Representation (url)
- Interpretable Neural Predictions with Differentiable Binary Variables (url)
- Material for a tutorial on variational inference for NLP audiences (url)
- PyTorch implementation of Block Neural Autoregressive Flow (url)
- The Power Spherical distribution (url)
- Probabilistic Language Learning Group (url)
- Sparse distributions compatible with torch.distributions: code and related paper (the shared interface is sketched below)
- Constrained optimisation in PyTorch: code and example
- Block neural auto-regressive flows: code and paper
- Joint generative model of translation: code and paper
- Jupyter notebooks for continuous and discrete latent-variable VAEs: code
- Deep generative model for joint alignment and word representation: code and paper
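Several of the entries above (sparse distributions, normalising flows, latent-variable VAEs) build on the standard torch.distributions interface. The snippet below is a generic sketch of that interface using a plain Gaussian; it is not code from, and does not reflect the APIs of, the linked repositories.

```python
# Generic sketch of the torch.distributions interface (rsample / log_prob /
# kl_divergence) that the packages listed above build on. The Gaussian
# example is an illustrative assumption, not code from the linked repositories.
import torch
from torch import distributions as D

loc = torch.zeros(3, requires_grad=True)
scale = torch.ones(3, requires_grad=True)
q = D.Normal(loc, scale)                     # variational distribution q(z)
p = D.Normal(torch.zeros(3), torch.ones(3))  # fixed prior p(z)

z = q.rsample()                   # reparameterised, differentiable sample
log_q = q.log_prob(z).sum()       # log density of the sample under q
kl = D.kl_divergence(q, p).sum()  # analytic KL term, as in VAE-style objectives

(log_q + kl).backward()           # gradients flow back to loc and scale
print(loc.grad.shape, scale.grad.shape)
```

Custom distributions plug into the same interface by subclassing torch.distributions.Distribution and implementing rsample and log_prob.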
- Open Raadsinformatie API (url)
- Papers
- Official code for the paper The Pragmatics behind Politics: Modelling Metaphor, Framing and Emotion in Political Discourse (url)
- Official code for the paper Meta-Learning with Sparse Experience Replay for Lifelong Language Learning (url)
- Official code for the paper Learning to Learn to Disambiguate: Meta-Learning for Few-Shot Word Sense Disambiguation (url)
- Official code for the paper Graph-based Modeling of Online Communities for Fake News Detection (url)
- Official code for the paper Wikipedia entities as rendezvous across languages: grounding multilingual LMs by predicting Wikipedia hyperlinks (url)
6. Theoretical Computer Science (TCS)
- Quantum Information @ Amsterdam (url)
- A Python package for rigorous free fermion entanglement renormalization from wavelet theory (url)
- A SageMath package for computing moment polytopes associated with finite-dimensional representations of compact and connected Lie groups (url)
- This is a Python module and Jupyter notebook for tensor scaling, computing entanglement polytopes, and solving the one-body quantum marginal problem (url)
- A Maple package for computing Kronecker coefficients g(λ,μ,ν) (url); a short definition of these coefficients is given below
- Efficiently compute Kronecker coefficients of bounded height (url)
- Python code to calculate the entanglement fidelity and success probability of certain port-based teleportation protocols, as well as their asymptotics (url)
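For readers unfamiliar with the notation in the two Kronecker-coefficient entries above, the following standard definition may help (background only, not code from the repositories): g(λ,μ,ν) is the multiplicity with which the irreducible character χ^ν of the symmetric group S_n appears in the pointwise product χ^λ · χ^μ, for partitions λ, μ, ν of n.

```latex
% Standard definition of the Kronecker coefficients g(\lambda,\mu,\nu) for
% partitions \lambda, \mu, \nu of n, with \chi^\lambda the irreducible
% character of the symmetric group S_n labelled by \lambda:
\[
  \chi^{\lambda}\,\chi^{\mu} \;=\; \sum_{\nu \,\vdash\, n} g(\lambda,\mu,\nu)\,\chi^{\nu},
  \qquad
  g(\lambda,\mu,\nu) \;=\; \frac{1}{n!}\sum_{\sigma \in S_n}
      \chi^{\lambda}(\sigma)\,\chi^{\mu}(\sigma)\,\chi^{\nu}(\sigma).
\]
```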
- Computational Social Choice
- Dynamic Epistemic Logic
- A symbolic model checker for Dynamic Epistemic Logic (url)
If you wish to add a repository to this list and to this collection of repositories, please send an email to rdm-illc@uva.nl.