Links

Blogs

Blogposts

Some blog posts that were interesting to read.

Miscellaneous

SAT - Boolean satisfiability problem

Smalltalk

Articles

Software

(Interesting) Programming languages

Declarative machine description

How can we describe computers (hardware, virtual machines, programming languages, and so on) in a more declarative way, so that more components can be generated automatically from a single description rather than written by hand? This section collects references to already published work on the topic.
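To make the idea concrete, here is a minimal sketch using a hypothetical three-instruction accumulator machine: the instruction set is described once as a data table, and both an interpreter and a disassembler are derived from that single description instead of being written twice.

```python
# Hypothetical mini-ISA, described once as data.
# Each entry: opcode -> (mnemonic, operand kind, effect on machine state).
ISA = {
    0x01: ("LOADI", "imm", lambda st, v: st.__setitem__("acc", v)),
    0x02: ("ADDI",  "imm", lambda st, v: st.__setitem__("acc", st["acc"] + v)),
    0x03: ("NEG",   None,  lambda st, v: st.__setitem__("acc", -st["acc"])),
}

def interpret(program):
    """Interpreter derived from the ISA table."""
    state, pc = {"acc": 0}, 0
    while pc < len(program):
        mnemonic, kind, effect = ISA[program[pc]]
        operand = program[pc + 1] if kind == "imm" else None
        effect(state, operand)
        pc += 2 if kind == "imm" else 1
    return state["acc"]

def disassemble(program):
    """Disassembler derived from the same table -- no second description."""
    out, pc = [], 0
    while pc < len(program):
        mnemonic, kind, _ = ISA[program[pc]]
        if kind == "imm":
            out.append(f"{mnemonic} {program[pc + 1]}")
            pc += 2
        else:
            out.append(mnemonic)
            pc += 1
    return out

prog = [0x01, 40, 0x02, 2, 0x03]   # LOADI 40; ADDI 2; NEG
print(interpret(prog))             # -42
print(disassemble(prog))           # ['LOADI 40', 'ADDI 2', 'NEG']
```

Real machine-description languages apply the same principle at much larger scale, deriving assemblers, simulators, and compiler backends from one specification.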

Some tricky challenges in this area:

Rust

Symbolic Execution

Turing Machine simulators

Declarative Programming

Prolog

Datalog

Neurosymbolic programming

Scallop

https://www.scallop-lang.org/
Scallop: From Probabilistic Deductive Databases to Scalable Differentiable Reasoning

Lobster

Lobster: A GPU-Accelerated Framework for Neurosymbolic Programming

Dyna

https://dyna.org/
https://github.com/argolab/dyna3
https://matthewfl.com/research#phd
dyna-pi: Dynamic Program Improvement

PyNeuraLogic

PyNeuraLogic lets you use Python to create differentiable logic programs. See the PyNeuraLogic documentation and GitHub repository. Paper: Beyond Graph Neural Networks with Lifted Relational Neural Networks, abstract:
We demonstrate a declarative differentiable programming framework based on the language of Lifted Relational Neural Networks, where small parameterized logic programs are used to encode relational learning scenarios. When presented with relational data, such as various forms of graphs, the program interpreter dynamically unfolds differentiable computational graphs to be used for the program parameter optimization by standard means. Following from the used declarative Datalog abstraction, this results in compact and elegant learning programs, in contrast with the existing procedural approaches operating directly on the computational graph level. We illustrate how this idea can be used for an efficient encoding of a diverse range of existing advanced neural architectures, with a particular focus on Graph Neural Networks (GNNs). Additionally, we show how the contemporary GNN models can be easily extended towards higher relational expressiveness. In the experiments, we demonstrate correctness and computation efficiency through comparison against specialized GNN deep learning frameworks, while shedding some light on the learning performance of existing GNN models.
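The Datalog abstraction that the abstract refers to can be illustrated without any neural machinery. The sketch below (plain Python, not the PyNeuraLogic API) evaluates the classic transitive-closure program by forward chaining to a fixed point; a lifted relational framework unfolds a differentiable computation graph along the same derivation structure.

```python
# Facts: the edge relation of a small graph, as (relation, arg1, arg2) tuples.
facts = {("edge", "a", "b"), ("edge", "b", "c"), ("edge", "c", "d")}

def step(db):
    """One round of applying the rules:
       path(X, Y) :- edge(X, Y).
       path(X, Z) :- path(X, Y), edge(Y, Z)."""
    derived = set(db)
    for (rel, x, y) in db:
        if rel == "edge":
            derived.add(("path", x, y))
        if rel == "path":
            for (rel2, y2, z) in db:
                if rel2 == "edge" and y2 == y:
                    derived.add(("path", x, z))
    return derived

def solve(db):
    """Iterate until no new facts are derived (the fixed point)."""
    while True:
        nxt = step(db)
        if nxt == db:
            return db
        db = nxt

db = solve(facts)
print(("path", "a", "d") in db)  # True: a -> b -> c -> d
```

Systems like PyNeuraLogic and Scallop replace the boolean membership test with learnable weights on facts and rules, so the same fixed-point evaluation becomes differentiable.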

Things that have been read

Durable workflow software companies

Things to read in the future

Constructive solid geometry

Operating systems

Software

C programming language (blogposts, etc)

Mesh network implementations

Uncategorized (todo)