At my German university in the late 1990s we did use a few programming languages, but none of them for their own sake — they were there for demonstration. Modula-2 for introducing higher-level concepts such as modules and OOP, VHDL for hardware, PHIGS for graphics (okay, not a language), SQL of course, and some C and C++.
We did not truly learn any of those though, it's just that they were used.
The specific languages did not really matter; they were secondary, just tools. One theoretical CS professor used some really obscure, niche thing whose name I've forgotten — again, only to demonstrate some theoretical aspects (provability, I think).
Most of the content was math and formal material, plus some experiments and low-level work — like generating sound directly from a chip (no computer), in assembler, as part of the intro to hardware.
I would have felt pretty irritated, and like I was wasting my time, if we had had courses about learning programming languages (designing them, writing compilers, etc., is another matter). Learning the actual languages is something easily and better done on your own.