Deep learning may fit in the Prolog paradigm in terms of finding programs to compute the output from the input. I am sure people are working on it; it is one aspect of differentiable programming. This is similar to functional approaches as well. The search here is for the function f that computes f(A)=B ... Today we specify the state space with CNN or transformer models. Prolog knows how to do deductions from logical statements, so you'd need something similar for mapping f, I guess.
>> Deep learning may fit in the Prolog paradigm in terms of finding programs to compute the output from the input.
That's interesting, but my hunch is that you'd need two deep learning models to do what a single Prolog program could do: one model to map inputs to outputs and one to map outputs to inputs.
One way to describe deep neural nets is to say that they are function approximators. Now, functions have well-defined inputs and outputs, although I don't think that's actual mathematical terminology. Functions are basically mappings from the elements of one or more sets _to_ the elements of another set (or sets). But a function mapping is uni-directional. If you have a function ƒ: X → Y, it maps elements of X to Y, but it doesn't map elements of Y to X.
So for instance, if you have a function that maps the set of integers from 1 to 26 to the set of letters in the English alphabet, you don't simultaneously have a function that maps letters to integers; you need another function. This is true both in mathematical terms and in programming terms.
Suppose for instance that you have a pseudocode function with the signature:
int_char(int: n) -> char: c
You can call int_char() as:
int_char(2)
And get out "b", but you can't call it the other way:
int_char("b")
And expect to get 2 in return.
In Prolog, on the other hand, there are no functions. Rather, every expression is a predicate, or in other words an n-ary relation. Now, the concept of a relation is a generalisation of the concept of a function, so in Prolog you could write int_char() above with the signature:
int_char(N,C).
And call it with either or both arguments instantiated... or none:
?- int_char(2,C).
C = b.
?- int_char(N,b).
N = 2.
?- int_char(2,b).
true.
?- int_char(N,C).
N = 1, C = a ;
N = 2, C = b ;
N = 3, C = c ;
% ... etc
Which you can't easily do with other languages. Well, you can do it in, say, C if you write a Prolog interpreter in C :-)
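The post doesn't show a definition for int_char/2, so here is one minimal sketch of how it might be written, assuming SWI-Prolog's between/3 and char_code/2 built-ins (the names and approach here are my own illustration, not from the original post):

% Hypothetical definition of int_char/2 as a single relational clause.
int_char(N, C) :-
    between(1, 26, N),      % check a bound N, or enumerate N in 1..26
    Code is 0'a + N - 1,    % shift N into the a-z character-code range
    char_code(C, Code).     % relate the character C to its code

Because between/3 can either check a bound N or generate values for an unbound one, and the character code is always computed before char_code/2 is called, this single clause should answer all of the instantiation patterns in the queries above.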
Anyway, deep learning learns functions, not relations, so its models can't go back and forth between the sets of a mapping. In theory, anyway; deep neural nets often don't really care about theory.